15980 1727204137.36097: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-MVC
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
15980 1727204137.37389: Added group all to inventory
15980 1727204137.37391: Added group ungrouped to inventory
15980 1727204137.37396: Group all now contains ungrouped
15980 1727204137.37400: Examining possible inventory source: /tmp/network-jrl/inventory-0Xx.yml
15980 1727204137.87322: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
15980 1727204137.87399: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
15980 1727204137.87429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
15980 1727204137.88118: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
15980 1727204137.88214: Loaded config def from plugin (inventory/script)
15980 1727204137.88216: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
15980 1727204137.88262: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
15980 1727204137.88686: Loaded config def from plugin (inventory/yaml)
15980 1727204137.88689: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
15980 1727204137.89304: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
15980 1727204137.90293: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
15980 1727204137.90298: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
15980 1727204137.90302: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
15980 1727204137.90308: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
15980 1727204137.90314: Loading data from /tmp/network-jrl/inventory-0Xx.yml
15980 1727204137.90699: /tmp/network-jrl/inventory-0Xx.yml was not parsable by auto
15980 1727204137.90816: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
15980 1727204137.91069: Loading data from /tmp/network-jrl/inventory-0Xx.yml
15980 1727204137.91178: group all already in inventory
15980 1727204137.91186: set inventory_file for managed-node1
15980 1727204137.91191: set inventory_dir for managed-node1
15980 1727204137.91192: Added host managed-node1 to inventory
15980 1727204137.91194: Added host managed-node1 to group all
15980 1727204137.91195: set ansible_host for managed-node1
15980 1727204137.91196: set ansible_ssh_extra_args for managed-node1
15980 1727204137.91199: set inventory_file for managed-node2
15980 1727204137.91202: set inventory_dir for managed-node2
15980 1727204137.91203: Added host managed-node2 to inventory
15980 1727204137.91204: Added host managed-node2 to group all
15980 1727204137.91205: set ansible_host for managed-node2
15980 1727204137.91206: set ansible_ssh_extra_args for managed-node2
15980 1727204137.91209: set inventory_file for managed-node3
15980 1727204137.91212: set inventory_dir for managed-node3
15980 1727204137.91213: Added host managed-node3 to inventory
15980 1727204137.91214: Added host managed-node3 to group all
15980 1727204137.91215: set ansible_host for managed-node3
15980 1727204137.91216: set ansible_ssh_extra_args for managed-node3
15980 1727204137.91219: Reconcile groups and hosts in inventory.
15980 1727204137.91223: Group ungrouped now contains managed-node1
15980 1727204137.91225: Group ungrouped now contains managed-node2
15980 1727204137.91227: Group ungrouped now contains managed-node3
15980 1727204137.91529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
15980 1727204137.91889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
15980 1727204137.91947: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
15980 1727204137.91987: Loaded config def from plugin (vars/host_group_vars)
15980 1727204137.91990: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
15980 1727204137.91999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
15980 1727204137.92008: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
15980 1727204137.92060: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
15980 1727204137.92943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15980 1727204137.93169: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
15980 1727204137.93217: Loaded config def from plugin (connection/local)
15980 1727204137.93221: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
15980 1727204137.95351: Loaded config def from plugin (connection/paramiko_ssh)
15980 1727204137.95357: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
15980 1727204137.98169: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
15980 1727204137.98219: Loaded config def from plugin (connection/psrp)
15980 1727204137.98224: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
15980 1727204138.00504: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
15980 1727204138.00556: Loaded config def from plugin (connection/ssh)
15980 1727204138.00560: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
15980 1727204138.05590: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
15980 1727204138.05650: Loaded config def from plugin (connection/winrm)
15980 1727204138.05655: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
15980 1727204138.05813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
15980 1727204138.05893: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
15980 1727204138.06158: Loaded config def from plugin (shell/cmd)
15980 1727204138.06161: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
15980 1727204138.06193: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
15980 1727204138.06264: Loaded config def from plugin (shell/powershell)
15980 1727204138.06268: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
15980 1727204138.06328: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
15980 1727204138.07355: Loaded config def from plugin (shell/sh)
15980 1727204138.07359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
15980 1727204138.07475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
15980 1727204138.07751: Loaded config def from plugin (become/runas)
15980 1727204138.07754: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
15980 1727204138.08385: Loaded config def from plugin (become/su)
15980 1727204138.08389: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
15980 1727204138.08901: Loaded config def from plugin (become/sudo)
15980 1727204138.08904: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
15980 1727204138.08949: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml
15980 1727204138.10120: in VariableManager get_vars()
15980 1727204138.10149: done with get_vars()
15980 1727204138.11152: trying /usr/local/lib/python3.12/site-packages/ansible/modules
15980 1727204138.19696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
15980 1727204138.20052: in VariableManager get_vars()
15980 1727204138.20058: done with get_vars()
15980 1727204138.20061: variable 'playbook_dir' from source: magic vars
15980 1727204138.20062: variable 'ansible_playbook_python' from source: magic vars
15980 1727204138.20063: variable 'ansible_config_file' from source: magic vars
15980 1727204138.20064: variable 'groups' from source: magic vars
15980 1727204138.20064: variable 'omit' from source: magic vars
15980 1727204138.20067: variable 'ansible_version' from source: magic vars
15980 1727204138.20067: variable 'ansible_check_mode' from source: magic vars
15980 1727204138.20068: variable 'ansible_diff_mode' from source: magic vars
15980 1727204138.20069: variable 'ansible_forks' from source: magic vars
15980 1727204138.20070: variable 'ansible_inventory_sources' from source: magic vars
15980 1727204138.20070: variable 'ansible_skip_tags' from source: magic vars
15980 1727204138.20071: variable 'ansible_limit' from source: magic vars
15980 1727204138.20072: variable 'ansible_run_tags' from source: magic vars
15980 1727204138.20072: variable 'ansible_verbosity' from source: magic vars
15980 1727204138.20117: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml
15980 1727204138.21181: in VariableManager get_vars()
15980 1727204138.21201: done with get_vars()
15980 1727204138.21364: in VariableManager get_vars()
15980 1727204138.21384: done with get_vars()
15980 1727204138.21421: in VariableManager get_vars()
15980 1727204138.21438: done with get_vars()
15980 1727204138.21651: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15980 1727204138.22293: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15980 1727204138.22488: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15980 1727204138.24401: in VariableManager get_vars()
15980 1727204138.24427: done with get_vars()
15980 1727204138.25519: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
15980 1727204138.25909: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15980 1727204138.29376: in VariableManager get_vars()
15980 1727204138.29380: done with get_vars()
15980 1727204138.29383: variable 'playbook_dir' from source: magic vars
15980 1727204138.29384: variable 'ansible_playbook_python' from source: magic vars
15980 1727204138.29385: variable 'ansible_config_file' from source: magic vars
15980 1727204138.29385: variable 'groups' from source: magic vars
15980 1727204138.29386: variable 'omit' from source: magic vars
15980 1727204138.29387: variable 'ansible_version' from source: magic vars
15980 1727204138.29388: variable 'ansible_check_mode' from source: magic vars
15980 1727204138.29388: variable 'ansible_diff_mode' from source: magic vars
15980 1727204138.29389: variable 'ansible_forks' from source: magic vars
15980 1727204138.29390: variable 'ansible_inventory_sources' from source: magic vars
15980 1727204138.29391: variable 'ansible_skip_tags' from source: magic vars
15980 1727204138.29391: variable 'ansible_limit' from source: magic vars
15980 1727204138.29392: variable 'ansible_run_tags' from source: magic vars
15980 1727204138.29393: variable 'ansible_verbosity' from source: magic vars
15980 1727204138.29438: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
15980 1727204138.29777: in VariableManager get_vars()
15980 1727204138.29801: done with get_vars()
15980 1727204138.29844: in VariableManager get_vars()
15980 1727204138.29847: done with get_vars()
15980 1727204138.29850: variable 'playbook_dir' from source: magic vars
15980 1727204138.29851: variable 'ansible_playbook_python' from source: magic vars
15980 1727204138.29852: variable 'ansible_config_file' from source: magic vars
15980 1727204138.29853: variable 'groups' from source: magic vars
15980 1727204138.29854: variable 'omit' from source: magic vars
15980 1727204138.29854: variable 'ansible_version' from source: magic vars
15980 1727204138.29855: variable 'ansible_check_mode' from source: magic vars
15980 1727204138.29856: variable 'ansible_diff_mode' from source: magic vars
15980 1727204138.29857: variable 'ansible_forks' from source: magic vars
15980 1727204138.29857: variable 'ansible_inventory_sources' from source: magic vars
15980 1727204138.29858: variable 'ansible_skip_tags' from source: magic vars
15980 1727204138.29859: variable 'ansible_limit' from source: magic vars
15980 1727204138.29860: variable 'ansible_run_tags' from source: magic vars
15980 1727204138.29860: variable 'ansible_verbosity' from source: magic vars
15980 1727204138.30018: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
15980 1727204138.30203: in VariableManager get_vars()
15980 1727204138.30218: done with get_vars()
15980 1727204138.30445: in VariableManager get_vars()
15980 1727204138.30449: done with get_vars()
15980 1727204138.30451: variable 'playbook_dir' from source: magic vars
15980 1727204138.30452: variable 'ansible_playbook_python' from source: magic vars
15980 1727204138.30453: variable 'ansible_config_file' from source: magic vars
15980 1727204138.30454: variable 'groups' from source: magic vars
15980 1727204138.30455: variable 'omit' from source: magic vars
15980 1727204138.30456: variable 'ansible_version' from source: magic vars
15980 1727204138.30456: variable 'ansible_check_mode' from source: magic vars
15980 1727204138.30457: variable 'ansible_diff_mode' from source: magic vars
15980 1727204138.30458: variable 'ansible_forks' from source: magic vars
15980 1727204138.30464: variable 'ansible_inventory_sources' from source: magic vars
15980 1727204138.30465: variable 'ansible_skip_tags' from source: magic vars
15980 1727204138.30469: variable 'ansible_limit' from source: magic vars
15980 1727204138.30469: variable 'ansible_run_tags' from source: magic vars
15980 1727204138.30470: variable 'ansible_verbosity' from source: magic vars
15980 1727204138.30510: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
15980 1727204138.30710: in VariableManager get_vars()
15980 1727204138.30714: done with get_vars()
15980 1727204138.30717: variable 'playbook_dir' from source: magic vars
15980 1727204138.30718: variable 'ansible_playbook_python' from source: magic vars
15980 1727204138.30718: variable 'ansible_config_file' from source: magic vars
15980 1727204138.30719: variable 'groups' from source: magic vars
15980 1727204138.30720: variable 'omit' from source: magic vars
15980 1727204138.30721: variable 'ansible_version' from source: magic vars
15980 1727204138.30722: variable 'ansible_check_mode' from source: magic vars
15980 1727204138.30722: variable 'ansible_diff_mode' from source: magic vars
15980 1727204138.30723: variable 'ansible_forks' from source: magic vars
15980 1727204138.30724: variable 'ansible_inventory_sources' from source: magic vars
15980 1727204138.30725: variable 'ansible_skip_tags' from source: magic vars
15980 1727204138.30725: variable 'ansible_limit' from source: magic vars
15980 1727204138.30726: variable 'ansible_run_tags' from source: magic vars
15980 1727204138.30727: variable 'ansible_verbosity' from source: magic vars
15980 1727204138.30873: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
15980 1727204138.31010: in VariableManager get_vars()
15980 1727204138.31099: done with get_vars()
15980 1727204138.31152: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15980 1727204138.31450: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15980 1727204138.31698: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15980 1727204138.32649: in VariableManager get_vars()
15980 1727204138.32788: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15980 1727204138.36708: in VariableManager get_vars()
15980 1727204138.36726: done with get_vars()
15980 1727204138.36977: in VariableManager get_vars()
15980 1727204138.36981: done with get_vars()
15980 1727204138.36984: variable 'playbook_dir' from source: magic vars
15980 1727204138.36984: variable 'ansible_playbook_python' from source: magic vars
15980 1727204138.36985: variable 'ansible_config_file' from source: magic vars
15980 1727204138.36986: variable 'groups' from source: magic vars
15980 1727204138.36987: variable 'omit' from source: magic vars
15980 1727204138.36988: variable 'ansible_version' from source: magic vars
15980 1727204138.36989: variable 'ansible_check_mode' from source: magic vars
15980 1727204138.36989: variable 'ansible_diff_mode' from source: magic vars
15980 1727204138.36990: variable 'ansible_forks' from source: magic vars
15980 1727204138.36991: variable 'ansible_inventory_sources' from source: magic vars
15980 1727204138.36992: variable 'ansible_skip_tags' from source: magic vars
15980 1727204138.36992: variable 'ansible_limit' from source: magic vars
15980 1727204138.36993: variable 'ansible_run_tags' from source: magic vars
15980 1727204138.36994: variable 'ansible_verbosity' from source: magic vars
15980 1727204138.37038: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
15980 1727204138.37122: in VariableManager get_vars()
15980 1727204138.37138: done with get_vars()
15980 1727204138.37393: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15980 1727204138.38006: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15980 1727204138.38097: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15980 1727204138.46889: in VariableManager get_vars()
15980 1727204138.46921: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15980 1727204138.50536: in VariableManager get_vars()
15980 1727204138.50541: done with get_vars()
15980 1727204138.50543: variable 'playbook_dir' from source: magic vars
15980 1727204138.50544: variable 'ansible_playbook_python' from source: magic vars
15980 1727204138.50545: variable 'ansible_config_file' from source: magic vars
15980 1727204138.50546: variable 'groups' from source: magic vars
15980 1727204138.50547: variable 'omit' from source: magic vars
15980 1727204138.50547: variable 'ansible_version' from source: magic vars
15980 1727204138.50548: variable 'ansible_check_mode' from source: magic vars
15980 1727204138.50549: variable 'ansible_diff_mode' from source: magic vars
15980 1727204138.50550: variable 'ansible_forks' from source: magic vars
15980 1727204138.50551: variable 'ansible_inventory_sources' from source: magic vars
15980 1727204138.50552: variable 'ansible_skip_tags' from source: magic vars
15980 1727204138.50552: variable 'ansible_limit' from source: magic vars
15980 1727204138.50553: variable 'ansible_run_tags' from source: magic vars
15980 1727204138.50554: variable 'ansible_verbosity' from source: magic vars
15980 1727204138.50795: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
15980 1727204138.50879: in VariableManager get_vars()
15980 1727204138.50909: done with get_vars()
15980 1727204138.50954: in VariableManager get_vars()
15980 1727204138.50958: done with get_vars()
15980 1727204138.50960: variable 'playbook_dir' from source: magic vars
15980 1727204138.50961: variable 'ansible_playbook_python' from source: magic vars
15980 1727204138.50962: variable 'ansible_config_file' from source: magic vars
15980 1727204138.50963: variable 'groups' from source: magic vars
15980 1727204138.50963: variable 'omit' from source: magic vars
15980 1727204138.50964: variable 'ansible_version' from source: magic vars
15980 1727204138.51168: variable 'ansible_check_mode' from source: magic vars
15980 1727204138.51171: variable 'ansible_diff_mode' from source: magic vars
15980 1727204138.51171: variable 'ansible_forks' from source: magic vars
15980 1727204138.51172: variable 'ansible_inventory_sources' from source: magic vars
15980 1727204138.51173: variable 'ansible_skip_tags' from source: magic vars
15980 1727204138.51174: variable 'ansible_limit' from source: magic vars
15980 1727204138.51175: variable 'ansible_run_tags' from source: magic vars
15980 1727204138.51175: variable 'ansible_verbosity' from source: magic vars
15980 1727204138.51214: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
15980 1727204138.51393: in VariableManager get_vars()
15980 1727204138.51410: done with get_vars()
15980 1727204138.51487: in VariableManager get_vars()
15980 1727204138.51499: done with get_vars()
15980 1727204138.51999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
15980 1727204138.52015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
15980 1727204138.52831: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
15980 1727204138.53308: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
15980 1727204138.53317: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-MVC/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
15980 1727204138.53355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
15980 1727204138.54214: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
15980 1727204138.54848: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
15980 1727204138.54917: Loaded config def from plugin (callback/default)
15980 1727204138.54920: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
15980 1727204138.58217: Loaded config def from plugin (callback/junit)
15980 1727204138.58221: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
15980 1727204138.58489: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
15980 1727204138.58575: Loaded config def from plugin (callback/minimal)
15980 1727204138.58578: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
15980 1727204138.58631: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
15980 1727204138.58910: Loaded config def from plugin (callback/tree)
15980 1727204138.58914: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
15980 1727204138.59058: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
15980 1727204138.59061: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-MVC/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bridge_nm.yml **************************************************
11 plays in /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml
15980 1727204138.59297: in VariableManager get_vars()
15980 1727204138.59315: done with get_vars()
15980 1727204138.59322: in VariableManager get_vars()
15980 1727204138.59379: done with get_vars()
15980 1727204138.59386: variable 'omit' from source: magic vars
15980 1727204138.59433: in VariableManager get_vars()
15980 1727204138.59448: done with get_vars()
15980 1727204138.59677: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bridge.yml' with nm as provider] ***********
15980 1727204138.60848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
15980 1727204138.60935: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
15980 1727204138.61187: getting the remaining hosts for this loop
15980 1727204138.61189: done getting the remaining hosts for this loop
15980 1727204138.61193: getting the next task for host managed-node2
15980 1727204138.61197: done getting next task for host managed-node2
15980 1727204138.61199: ^ task is: TASK: Gathering Facts
15980 1727204138.61201: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15980 1727204138.61203: getting variables
15980 1727204138.61204: in VariableManager get_vars()
15980 1727204138.61219: Calling all_inventory to load vars for managed-node2
15980 1727204138.61221: Calling groups_inventory to load vars for managed-node2
15980 1727204138.61224: Calling all_plugins_inventory to load vars for managed-node2
15980 1727204138.61240: Calling all_plugins_play to load vars for managed-node2
15980 1727204138.61251: Calling groups_plugins_inventory to load vars for managed-node2
15980 1727204138.61254: Calling groups_plugins_play to load vars for managed-node2
15980 1727204138.61293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15980 1727204138.61352: done with get_vars()
15980 1727204138.61361: done getting variables
15980 1727204138.61649: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
Tuesday 24 September 2024 14:55:38 -0400 (0:00:00.028) 0:00:00.028 *****
15980 1727204138.61780: entering _queue_task() for managed-node2/gather_facts
15980 1727204138.61782: Creating lock for gather_facts
15980 1727204138.62489: worker is 1 (out of 1 available)
15980 1727204138.62500: exiting _queue_task() for managed-node2/gather_facts
15980 1727204138.62515: done queuing things up, now waiting for results queue to drain
15980 1727204138.62517: waiting for pending results...
15980 1727204138.63102: running TaskExecutor() for managed-node2/TASK: Gathering Facts
15980 1727204138.63232: in run() - task 127b8e07-fff9-5f1d-4b72-00000000007e
15980 1727204138.63412: variable 'ansible_search_path' from source: unknown
15980 1727204138.63416: calling self._execute()
15980 1727204138.63500: variable 'ansible_host' from source: host vars for 'managed-node2'
15980 1727204138.63774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
15980 1727204138.63777: variable 'omit' from source: magic vars
15980 1727204138.63814: variable 'omit' from source: magic vars
15980 1727204138.63852: variable 'omit' from source: magic vars
15980 1727204138.64173: variable 'omit' from source: magic vars
15980 1727204138.64176: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15980 1727204138.64217: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15980 1727204138.64247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15980 1727204138.64274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15980 1727204138.64470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15980 1727204138.64476: variable 'inventory_hostname' from source: host vars for 'managed-node2'
15980 1727204138.64479: variable 'ansible_host' from source: host vars for 'managed-node2'
15980 1727204138.64481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
15980 1727204138.64568: Set connection var ansible_connection to ssh
15980 1727204138.64681: Set connection var ansible_pipelining to False
15980 1727204138.64691: Set connection var ansible_module_compression to ZIP_DEFLATED
15980 1727204138.64700: Set connection var ansible_timeout to 10
15980 1727204138.64707: Set connection var ansible_shell_type to sh
15980 1727204138.64719: Set connection var ansible_shell_executable to /bin/sh
15980 1727204138.64851: variable 'ansible_shell_executable' from source: unknown
15980 1727204138.65043: variable 'ansible_connection' from source: unknown
15980 1727204138.65046: variable 'ansible_module_compression' from source: unknown
15980 1727204138.65049: variable 'ansible_shell_type' from source: unknown
15980 1727204138.65051: variable 'ansible_shell_executable' from source: unknown
15980 1727204138.65054: variable 'ansible_host' from source: host vars for 'managed-node2'
15980 1727204138.65056: variable 'ansible_pipelining' from source: unknown
15980 1727204138.65058: variable 'ansible_timeout' from source: unknown
15980 1727204138.65060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
15980 1727204138.65380: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
15980 1727204138.65397: variable 'omit' from source: magic vars
15980 1727204138.65406: starting attempt loop
15980 1727204138.65571: running the handler
15980 1727204138.65576: variable 'ansible_facts' from source: unknown
15980 1727204138.65580: _low_level_execute_command(): starting
15980 1727204138.65582: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
15980 1727204138.67049: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<<
15980 1727204138.67069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
15980 1727204138.67187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.47.73 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<<
15980 1727204138.67231: stderr chunk (state=3): >>>debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15980 1727204138.67292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<<
15980 1727204138.67459: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15980 1727204138.67479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15980 1727204138.67638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15980 1727204138.69806: stdout chunk (state=3): >>>/root <<<
15980 1727204138.69836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15980 1727204138.69840: stdout chunk (state=3): >>><<<
15980 1727204138.69853: stderr chunk (state=3): >>><<<
15980 1727204138.69883: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204138.70055: _low_level_execute_command(): starting 15980 1727204138.70059: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204138.6994796-16302-2174381847922 `" && echo ansible-tmp-1727204138.6994796-16302-2174381847922="` echo /root/.ansible/tmp/ansible-tmp-1727204138.6994796-16302-2174381847922 `" ) && sleep 0' 15980 1727204138.71403: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204138.71418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204138.71435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204138.71452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204138.71680: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data 
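[Annotation] The first remote command in this exchange was `/bin/sh -c 'echo ~ && sleep 0'`, which returned `/root`: Ansible uses it to discover the remote user's home directory before staging any files. A minimal standalone sketch (the `home_probe` wrapper name is ours, not Ansible's):

```shell
# Reproduce the home-directory probe seen in the log. The inner sh expands
# the unquoted ~ to $HOME; `sleep 0` keeps the command from exiting before
# the controller has drained the chunked stdout stream.
home_probe() {
    /bin/sh -c 'echo ~ && sleep 0'
}
home_probe
```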
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204138.71707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204138.71813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204138.73827: stdout chunk (state=3): >>>ansible-tmp-1727204138.6994796-16302-2174381847922=/root/.ansible/tmp/ansible-tmp-1727204138.6994796-16302-2174381847922 <<< 15980 1727204138.74575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204138.74581: stdout chunk (state=3): >>><<< 15980 1727204138.74583: stderr chunk (state=3): >>><<< 15980 1727204138.74586: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204138.6994796-16302-2174381847922=/root/.ansible/tmp/ansible-tmp-1727204138.6994796-16302-2174381847922 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204138.74588: variable 'ansible_module_compression' from source: unknown 15980 1727204138.74591: ANSIBALLZ: Using generic lock for ansible.legacy.setup 15980 1727204138.74593: ANSIBALLZ: Acquiring lock 15980 1727204138.74596: ANSIBALLZ: Lock acquired: 139981197612416 15980 1727204138.74598: ANSIBALLZ: Creating module 15980 1727204139.65348: ANSIBALLZ: Writing module into payload 15980 1727204139.65547: ANSIBALLZ: Writing module 15980 1727204139.65595: ANSIBALLZ: Renaming module 15980 1727204139.65618: ANSIBALLZ: Done creating module 15980 1727204139.65702: variable 'ansible_facts' from source: unknown 15980 1727204139.65711: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204139.65717: _low_level_execute_command(): starting 15980 1727204139.65720: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 15980 1727204139.66504: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 15980 1727204139.66601: 
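[Annotation] The second remote command above creates the per-task temp directory with `( umask 77 && mkdir -p ... && mkdir ... && echo ... )`. The `umask 77` makes the directory mode 0700 so the staged module payload is readable only by the connecting user. A sketch of the same idiom (the `tmp_root`/`task-dir` paths are illustrative stand-ins, not the `ansible-tmp-...` path from the log):

```shell
# Remote tmp-dir idiom from the log: create the parent with -p, then the
# per-task dir, then echo the resulting path back so the controller can
# parse it from stdout. umask 77 yields 0700 directories.
tmp_root="${TMPDIR:-/tmp}/ansible-demo-$$"
( umask 77 \
  && mkdir -p "$tmp_root" \
  && mkdir "$tmp_root/task-dir" \
  && echo "task_tmp=$tmp_root/task-dir" )
```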
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204139.66669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204139.66897: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204139.67030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 1 <<< 15980 1727204139.68732: stdout chunk (state=3): >>>PLATFORM <<< 15980 1727204139.68891: stdout chunk (state=3): >>>Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 15980 1727204139.68973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204139.69215: stderr chunk (state=3): >>><<< 15980 1727204139.69218: stdout chunk (state=3): >>><<< 15980 1727204139.69241: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 1 debug2: Received exit status from master 0 15980 1727204139.69256 [managed-node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 15980 1727204139.69347: _low_level_execute_command(): starting 15980 1727204139.69433: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 15980 1727204139.69761: Sending initial data 15980 1727204139.69764: Sent initial data (1181 bytes) 15980 1727204139.71299: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match 
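[Annotation] The `PLATFORM ... FOUND ... ENDFOUND` exchange above is interpreter discovery: Ansible probes a preference-ordered list of python names with `command -v` and parses whatever paths come back between the markers (here `/usr/bin/python3.12` won). A condensed sketch of that probe (`probe_interpreters` is our name; the real command is a single inlined `sh -c` string):

```shell
# Interpreter-discovery probe, condensed from the log: print a PLATFORM
# marker and the kernel name, then every python candidate that resolves,
# bracketed by FOUND/ENDFOUND so the controller can parse the chunked output.
probe_interpreters() {
    echo PLATFORM
    uname
    echo FOUND
    for candidate in python3.12 python3.11 python3.10 python3.9 python3; do
        command -v "$candidate"   # absent candidates print nothing (exit 1)
    done
    echo ENDFOUND
}
probe_interpreters
```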
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204139.71422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204139.71586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204139.71778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204139.75796: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} <<< 15980 1727204139.76377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204139.76382: stdout chunk (state=3): >>><<< 15980 1727204139.76384: stderr chunk (state=3): >>><<< 15980 1727204139.76387: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 
(Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204139.76390: variable 'ansible_facts' from source: unknown 15980 1727204139.76392: variable 'ansible_facts' from source: unknown 15980 1727204139.76394: variable 'ansible_module_compression' from source: unknown 15980 1727204139.76621: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15980 1727204139.76663: variable 'ansible_facts' from source: unknown 15980 1727204139.77578: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204138.6994796-16302-2174381847922/AnsiballZ_setup.py 15980 1727204139.77900: Sending initial data 15980 1727204139.77915: Sent initial data (152 bytes) 15980 1727204139.78953: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204139.78976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204139.79240: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204139.79343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204139.79374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204139.81144: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204139.81395: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204138.6994796-16302-2174381847922/AnsiballZ_setup.py" <<< 15980 1727204139.81399: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpkvw5v5ig /root/.ansible/tmp/ansible-tmp-1727204138.6994796-16302-2174381847922/AnsiballZ_setup.py <<< 15980 1727204139.81457: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpkvw5v5ig" to remote "/root/.ansible/tmp/ansible-tmp-1727204138.6994796-16302-2174381847922/AnsiballZ_setup.py" <<< 15980 1727204139.81473: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204138.6994796-16302-2174381847922/AnsiballZ_setup.py" <<< 15980 1727204139.85676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204139.85863: stderr chunk (state=3): >>><<< 15980 1727204139.85869: stdout chunk (state=3): >>><<< 15980 1727204139.85900: done transferring module to remote 15980 1727204139.86101: _low_level_execute_command(): starting 15980 1727204139.86106: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204138.6994796-16302-2174381847922/ /root/.ansible/tmp/ansible-tmp-1727204138.6994796-16302-2174381847922/AnsiballZ_setup.py && sleep 0' 15980 1727204139.87168: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204139.87372: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204139.87377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204139.87380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204139.87435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204139.87442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204139.87571: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204139.89757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204139.89761: stdout chunk (state=3): >>><<< 15980 1727204139.89764: stderr chunk (state=3): >>><<< 15980 1727204139.89768: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204139.89771: _low_level_execute_command(): starting 15980 1727204139.89773: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204138.6994796-16302-2174381847922/AnsiballZ_setup.py && sleep 0' 15980 1727204139.91575: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204139.91633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
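[Annotation] The last three `_low_level_execute_command()` steps above form the standard module-staging sequence: transfer the payload over sftp, `chmod u+x` on both the temp directory and the file, then execute it with the discovered interpreter. A condensed local sketch of that sequence (paths and the `payload.sh` script are illustrative stand-ins for the `AnsiballZ_setup.py` temp files, and a trivial shell payload replaces the python module):

```shell
# Stage -> chmod -> execute, condensed from the log. mktemp -d stands in
# for the remote ansible-tmp-... directory; writing the file locally stands
# in for the sftp put.
stage="$(mktemp -d)"
printf '%s\n%s\n' '#!/bin/sh' 'echo module-ran' > "$stage/payload.sh"
chmod u+x "$stage" "$stage/payload.sh"   # same dir-and-file chmod pattern as the log
"$stage/payload.sh"                      # prints: module-ran
```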
hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204139.91725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204139.91741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204139.91872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204139.92090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204139.94463: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 15980 1727204139.94470: stdout chunk (state=3): >>>import _imp # builtin <<< 15980 1727204139.94522: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 15980 1727204139.94581: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 15980 1727204139.94625: stdout chunk (state=3): >>>import 'posix' # <<< 15980 1727204139.94734: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 15980 1727204139.94741: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 15980 1727204139.94749: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 15980 1727204139.94751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204139.94791: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 15980 1727204139.94819: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 15980 1727204139.94854: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 15980 1727204139.94857: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825f18530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825ee7b30> <<< 15980 1727204139.95057: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825f1aab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # <<< 15980 1727204139.95408: stdout chunk (state=3): >>>import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d2d190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d2e090> import 'site' # <<< 15980 1727204139.95415: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 15980 1727204139.95870: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204139.95880: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 15980 1727204139.95923: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 15980 1727204139.95976: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15980 1727204139.95984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d6be90> <<< 15980 1727204139.96061: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 15980 1727204139.96070: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 15980 1727204139.96099: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d6bf50> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 15980 1727204139.96106: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 15980 1727204139.96245: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825da3890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 15980 1727204139.96249: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825da3f20> <<< 15980 1727204139.96251: stdout chunk (state=3): >>>import '_collections' # <<< 15980 1727204139.96435: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d83b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d81280> <<< 15980 1727204139.96522: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d69040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 15980 1727204139.96645: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15980 1727204139.96648: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 15980 1727204139.96673: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825dc7800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825dc6420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d82150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825dc4c80> <<< 15980 1727204139.96715: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 15980 1727204139.96718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 15980 1727204139.96742: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825df8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 15980 1727204139.96774: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204139.97204: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import 
'_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825df8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825df8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825df8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d66e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825df96a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825df9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825dfa5a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 15980 1727204139.97208: stdout chunk (state=3): >>>import 'fnmatch' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8825e147d0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825e15f10> <<< 15980 1727204139.97287: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 15980 1727204139.97291: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825e16db0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825e17410> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825e16300> <<< 15980 1727204139.97313: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15980 1727204139.97543: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825e17e90> import 'lzma' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8825e175c0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825dfa600> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 15980 1727204139.97549: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825b57e00> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 15980 1727204139.97798: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825b808f0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825b80650> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825b80920> <<< 15980 1727204139.97802: stdout chunk (state=3): >>># extension module '_sha2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825b80b00> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825b55fa0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15980 1727204139.97805: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 15980 1727204139.97807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825b82210> <<< 15980 1727204139.97930: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825b80e90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825dfacf0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15980 1727204139.97943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204139.97986: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15980 1727204139.97992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15980 1727204139.98085: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825baa5d0> <<< 15980 1727204139.98095: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 15980 1727204139.98120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 15980 1727204139.98124: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 15980 1727204139.98220: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825bc6720> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 15980 1727204139.98299: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15980 1727204139.98374: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825bfb4d0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 15980 1727204139.98402: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 15980 1727204139.98425: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 15980 1727204139.98534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15980 1727204139.98551: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825c21c70> <<< 15980 1727204139.98780: stdout 
chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825bfb5f0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825bc73b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825bf8ce0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825bc5760> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825b83170> <<< 15980 1727204139.98889: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15980 1727204139.98905: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8825bc5880> <<< 15980 1727204139.99081: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_8_3me01u/ansible_ansible.legacy.setup_payload.zip' <<< 15980 1727204139.99123: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204139.99235: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204139.99336: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 15980 1727204139.99352: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15980 1727204139.99397: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15980 1727204139.99440: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches 
/usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825a6e2d0> import '_typing' # <<< 15980 1727204139.99881: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825a451c0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825a44380> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 15980 1727204140.02517: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.03703: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825a472c0> <<< 15980 1727204140.03762: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 15980 1727204140.03789: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 15980 1727204140.03811: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825aa1c10> <<< 15980 1727204140.03844: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825aa19a0> <<< 15980 1727204140.03892: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825aa12b0> <<< 15980 1727204140.03911: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 15980 1727204140.03975: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825aa1a30> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825a6ed50> import 'atexit' # <<< 15980 1727204140.04033: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825aa28d0> <<< 15980 1727204140.04045: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204140.04048: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825aa2b10> <<< 15980 1727204140.04059: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 15980 1727204140.04113: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 15980 1727204140.04124: stdout chunk (state=3): >>>import '_locale' # <<< 15980 1727204140.04176: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825aa2ff0> import 'pwd' # <<< 15980 1727204140.04209: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 15980 1727204140.04223: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15980 1727204140.04285: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825900e00> <<< 15980 1727204140.04304: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825902540> <<< 15980 1727204140.04326: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 15980 1727204140.04612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88259032c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88259041d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825906f30> <<< 15980 1727204140.04671: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204140.04674: stdout chunk (state=3): >>>import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825907080> <<< 15980 1727204140.04703: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825905220> <<< 15980 1727204140.04706: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 15980 1727204140.04733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 15980 1727204140.04762: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 15980 1727204140.04792: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15980 1727204140.04857: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 15980 1727204140.04873: stdout chunk (state=3): >>>import 'token' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f882590af30> <<< 15980 1727204140.04928: stdout chunk (state=3): >>>import '_tokenize' # <<< 15980 1727204140.04984: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825909a00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825909760> <<< 15980 1727204140.05000: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15980 1727204140.05161: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f882590be60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825905730> <<< 15980 1727204140.05194: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f882594f050> <<< 15980 1727204140.05221: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f882594f1a0> <<< 15980 1727204140.05269: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 15980 1727204140.05298: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 15980 1727204140.05360: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825954dd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825954b90> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15980 1727204140.05563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 15980 1727204140.05623: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88259572f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825955460> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 15980 1727204140.05696: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204140.05769: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 15980 1727204140.05927: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f882595eab0> <<< 15980 1727204140.06221: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825957440> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f882595f8f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f882595f740> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f882595fbc0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f882594f4a0> <<< 15980 1727204140.06249: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 15980 1727204140.06272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 15980 1727204140.06341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 15980 1727204140.06440: stdout chunk 
(state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825963410> <<< 15980 1727204140.06808: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825964350> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825961bb0> <<< 15980 1727204140.06863: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825962f60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88259617c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 15980 1727204140.06987: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.07136: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.07171: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 15980 1727204140.07214: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 15980 1727204140.07232: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 
1727204140.07487: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.07663: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.08970: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.09810: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 15980 1727204140.09846: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 15980 1727204140.09879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204140.09959: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88257e85f0> <<< 15980 1727204140.10101: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 15980 1727204140.10125: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 15980 1727204140.10152: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88257e9a00> <<< 15980 1727204140.10163: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88259642f0> <<< 15980 1727204140.10223: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 15980 1727204140.10249: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 
1727204140.10274: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.10305: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 15980 1727204140.10308: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.10589: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.10912: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88257e9b80> # zipimport: zlib available <<< 15980 1727204140.11803: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.12189: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.12234: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.12271: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 15980 1727204140.12582: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 15980 1727204140.12585: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 15980 1727204140.12588: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.12590: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.12592: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 15980 1727204140.12603: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.12651: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.12747: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 15980 1727204140.13044: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 
1727204140.13231: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15980 1727204140.13567: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88257ebe30> # zipimport: zlib available # zipimport: zlib available <<< 15980 1727204140.13775: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 15980 1727204140.13778: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 15980 1727204140.13959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204140.14078: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88257f1ee0> <<< 15980 1727204140.14189: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204140.14194: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204140.14227: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88257f2840> <<< 15980 1727204140.14255: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f88257eae70> <<< 15980 1727204140.14271: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.14314: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.14386: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 15980 1727204140.14407: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.14491: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.14578: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.14675: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.14821: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15980 1727204140.14908: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204140.15073: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204140.15076: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204140.15097: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88257f1550> <<< 15980 1727204140.15268: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88257f2a20> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 15980 1727204140.15336: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.15446: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.15490: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 15980 1727204140.15569: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204140.15613: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 15980 1727204140.15651: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 15980 1727204140.15691: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15980 1727204140.15784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 15980 1727204140.15805: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 15980 1727204140.15834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15980 1727204140.15891: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f882588acf0> <<< 15980 1727204140.15955: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88257fca40> <<< 15980 1727204140.16068: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88257faab0> <<< 15980 1727204140.16073: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88257fa900> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 15980 1727204140.16115: stdout chunk (state=3): >>># zipimport: zlib available 
# zipimport: zlib available <<< 15980 1727204140.16159: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15980 1727204140.16505: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 15980 1727204140.16619: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15980 1727204140.16730: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15980 1727204140.16757: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 15980 1727204140.16777: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.16879: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.16933: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.16972: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 15980 1727204140.17346: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.17688: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object 
from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825891910> <<< 15980 1727204140.17696: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 15980 1727204140.17699: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 15980 1727204140.17762: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 15980 1727204140.17788: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 15980 1727204140.17840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824d44440> <<< 15980 1727204140.17851: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8824d447a0> <<< 15980 1727204140.18100: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f882586d4c0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f882586ca70> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88258900b0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f8825893920> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 15980 1727204140.18108: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 15980 1727204140.18132: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8824d47800> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824d470b0> <<< 15980 1727204140.18165: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204140.18198: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8824d47290> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824d464e0> <<< 15980 1727204140.18215: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 15980 1727204140.18333: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 15980 1727204140.18353: stdout chunk 
(state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824d478f0> <<< 15980 1727204140.18373: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 15980 1727204140.18409: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 15980 1727204140.18434: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8824dae3f0> <<< 15980 1727204140.18469: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824dac410> <<< 15980 1727204140.18541: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88258910a0> import 'ansible.module_utils.facts.timeout' # <<< 15980 1727204140.18644: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 15980 1727204140.18648: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.18650: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.18725: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 15980 1727204140.18731: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.18787: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.18870: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.other.ohai' # <<< 15980 1727204140.18888: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 15980 1727204140.18891: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.19004: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 15980 1727204140.19026: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.19081: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 15980 1727204140.19122: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.19180: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 15980 1727204140.19216: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.19249: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.19525: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 15980 1727204140.19546: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.20456: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.21341: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 15980 1727204140.21379: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.21461: stdout chunk (state=3): >>># zipimport: zlib available<<< 15980 1727204140.21569: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15980 1727204140.21634: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.21718: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 15980 1727204140.21769: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.system.date_time' # <<< 15980 1727204140.21773: stdout chunk (state=3): >>># zipimport: zlib available<<< 15980 1727204140.21775: stdout chunk (state=3): >>> <<< 15980 1727204140.21838: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.21906: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available<<< 15980 1727204140.21910: stdout chunk (state=3): >>> <<< 15980 1727204140.22212: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.22258: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 15980 1727204140.22546: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824dae7b0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 15980 1727204140.22630: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824daf3e0> <<< 15980 1727204140.22656: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 15980 1727204140.22711: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.22800: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 15980 1727204140.22825: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 
1727204140.23202: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available <<< 15980 1727204140.23259: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 15980 1727204140.23306: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 15980 1727204140.23383: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204140.23452: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8824dde900> <<< 15980 1727204140.23687: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824dcb260> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 15980 1727204140.23746: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.23796: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 15980 1727204140.23826: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.23910: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.24480: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 15980 1727204140.24510: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.24579: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.24662: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 15980 1727204140.24667: stdout chunk (state=3): >>> <<< 15980 1727204140.24687: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.24770: stdout chunk (state=3): >>># zipimport: zlib available<<< 15980 1727204140.24781: stdout chunk (state=3): >>> <<< 15980 1727204140.24862: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py<<< 15980 1727204140.24886: stdout chunk (state=3): >>> <<< 15980 1727204140.24898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 15980 1727204140.24953: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so'<<< 15980 1727204140.25065: stdout chunk (state=3): >>> import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8824df9f40> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824dcb8c0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 15980 1727204140.25149: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # <<< 15980 1727204140.25181: stdout chunk (state=3): >>> # zipimport: zlib available<<< 15980 1727204140.25196: stdout chunk (state=3): >>> <<< 15980 1727204140.25485: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.25833: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 15980 1727204140.26102: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 
1727204140.26160: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.26219: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.26279: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 15980 1727204140.26302: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 15980 1727204140.26345: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15980 1727204140.26577: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.26838: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 15980 1727204140.27056: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.27254: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 15980 1727204140.27306: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15980 1727204140.27359: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.28384: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.29341: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 15980 1727204140.29700: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 15980 1727204140.29819: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.29984: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 15980 1727204140.30009: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.30259: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.30526: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 15980 1727204140.30529: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.30560: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 15980 1727204140.30613: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.30689: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 15980 1727204140.30958: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.30997: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.31374: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.31756: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 15980 1727204140.31760: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.31794: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.31856: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 15980 1727204140.31864: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.31916: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.31920: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 15980 1727204140.31932: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.32032: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.32136: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 15980 1727204140.32182: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15980 1727204140.32230: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 15980 
1727204140.32234: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.32406: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # <<< 15980 1727204140.32410: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.32488: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.32587: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 15980 1727204140.32591: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.33068: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.33556: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 15980 1727204140.33619: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.33703: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 15980 1727204140.33765: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15980 1727204140.33821: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 15980 1727204140.33824: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.33870: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.33906: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 15980 1727204140.33957: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15980 1727204140.34020: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 15980 1727204140.34031: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.34144: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.34270: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 15980 1727204140.34325: stdout chunk 
(state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15980 1727204140.34328: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 15980 1727204140.34474: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available <<< 15980 1727204140.34576: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.34591: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.34637: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.34818: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.34967: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 15980 1727204140.34970: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.35047: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 15980 1727204140.35489: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.35755: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 15980 1727204140.35833: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.35900: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 15980 1727204140.35915: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.36170: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.36192: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 15980 1727204140.36257: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 
1727204140.36317: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 15980 1727204140.36331: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.36468: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.36614: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 15980 1727204140.36723: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204140.37540: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 15980 1727204140.37580: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 15980 1727204140.37620: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8824c23350> <<< 15980 1727204140.37715: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824c221b0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824d44590> <<< 15980 1727204140.59673: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 15980 1727204140.59716: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 15980 1727204140.59720: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824c68c50> <<< 15980 1727204140.59723: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 15980 1727204140.59745: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 15980 1727204140.59752: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824c69c40> <<< 15980 1727204140.59846: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 15980 1727204140.59850: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204140.59852: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 15980 1727204140.59883: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 15980 1727204140.59890: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824dd40e0> <<< 15980 1727204140.59903: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824c6bad0> <<< 15980 1727204140.60149: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame 
PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 15980 1727204140.80601: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", 
"ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": 
"ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "40", "epoch": "1727204140", "epoch_int": "1727204140", "date": "2024-09-24", "time": "14:55:40", "iso8601_micro": "2024-09-24T18:55:40.388052Z", "iso8601": "2024-09-24T18:55:40Z", "iso8601_basic": "20240924T145540388052", "iso8601_basic_short": "20240924T145540", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.70458984375, "5m": 0.49462890625, "15m": 0.24169921875}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": 
"255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3041, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 675, "free": 3041}, "nocache": {"free": 3470, "used": 246}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 
1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 486, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325718528, "block_size": 4096, "block_total": 64479564, "block_available": 61358818, "block_used": 3120746, "inode_total": 16384000, "inode_available": 16301507, "inode_used": 82493, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 
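The large JSON object ending above is the result of the `setup` module: a single `{"ansible_facts": ..., "invocation": ...}` payload streamed back over stdout. Once captured, it is plain JSON and can be inspected with a few lines of Python. A minimal sketch (the sample below is a tiny hypothetical excerpt of the payload shown in the log, not the full object):

```python
import json

# Hypothetical, heavily abridged sample of the "ansible_facts" payload
# printed in the log above (the real payload has many more keys).
raw = '''
{"ansible_facts": {"ansible_distribution": "Fedora",
                   "ansible_distribution_major_version": "40",
                   "ansible_default_ipv4": {"address": "10.31.47.73",
                                            "interface": "eth0"}},
 "invocation": {"module_args": {"gather_subset": ["all"]}}}
'''

data = json.loads(raw)
facts = data["ansible_facts"]
# Facts are ordinary nested dicts once parsed.
print(facts["ansible_distribution"], facts["ansible_distribution_major_version"])
print(facts["ansible_default_ipv4"]["address"])
```

Running this prints `Fedora 40` and `10.31.47.73`, matching the distribution and default-IPv4 facts visible in the payload above.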
15980 1727204140.81003: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 15980 1727204140.81218: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing 
re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct <<< 15980 1727204140.81225: stdout chunk (state=3): >>># cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] 
removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd <<< 15980 1727204140.81293: stdout chunk (state=3): >>># cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # 
cleanup[2] removing distro.distro <<< 15980 1727204140.81355: stdout chunk (state=3): >>># cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system<<< 15980 1727204140.81359: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] 
removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns <<< 15980 1727204140.81497: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios <<< 15980 1727204140.81501: stdout chunk (state=3): >>># cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing 
ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy 
ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 15980 1727204140.81763: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15980 1727204140.81796: stdout chunk (state=3): 
>>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 15980 1727204140.81818: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 15980 1727204140.81960: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport <<< 15980 1727204140.81964: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 15980 1727204140.82105: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro <<< 15980 1727204140.82112: stdout chunk (state=3): >>># destroy argparse # destroy logging <<< 15980 1727204140.82160: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 15980 1727204140.82485: stdout chunk (state=3): >>># destroy _ssl <<< 15980 1727204140.82491: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 15980 1727204140.82516: stdout 
chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections 
# destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 15980 1727204140.82533: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15980 1727204140.82713: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 15980 1727204140.82736: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 15980 1727204140.82767: stdout chunk (state=3): >>># destroy _typing <<< 15980 1727204140.82790: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy 
ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 15980 1727204140.83027: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15980 1727204140.83043: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 15980 1727204140.83691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 15980 1727204140.83815: stderr chunk (state=3): >>><<< 15980 1727204140.83818: stdout chunk (state=3): >>><<< 15980 1727204140.84413: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825f18530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825ee7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825f1aab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d2d190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d2e090> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d6be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d6bf50> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825da3890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825da3f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d83b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d81280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d69040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825dc7800> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8825dc6420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d82150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825dc4c80> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825df8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825df8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825df8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825df8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825d66e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825df96a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825df9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825dfa5a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825e147d0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825e15f10> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8825e16db0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825e17410> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825e16300> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825e17e90> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825e175c0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825dfa600> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825b57e00> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825b808f0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825b80650> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825b80920> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825b80b00> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825b55fa0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825b82210> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825b80e90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825dfacf0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825baa5d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825bc6720> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825bfb4d0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825c21c70> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825bfb5f0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825bc73b0> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825bf8ce0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825bc5760> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825b83170> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8825bc5880> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_8_3me01u/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825a6e2d0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825a451c0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825a44380> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches 
/usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825a472c0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825aa1c10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825aa19a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825aa12b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825aa1a30> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825a6ed50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f8825aa28d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825aa2b10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825aa2ff0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825900e00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825902540> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88259032c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88259041d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825906f30> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825907080> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825905220> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f882590af30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825909a00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825909760> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f882590be60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825905730> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f882594f050> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f882594f1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825954dd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825954b90> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88259572f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825955460> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f882595eab0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825957440> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f882595f8f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f882595f740> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f882595fbc0> import 'systemd.journal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f882594f4a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825963410> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825964350> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825961bb0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8825962f60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88259617c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88257e85f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88257e9a00> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88259642f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88257e9b80> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88257ebe30> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88257f1ee0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88257f2840> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88257eae70> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88257f1550> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88257f2a20> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f882588acf0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88257fca40> import 'distro.distro' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f88257faab0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88257fa900> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825891910> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object 
from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824d44440> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8824d447a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f882586d4c0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f882586ca70> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88258900b0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8825893920> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8824d47800> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824d470b0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8824d47290> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824d464e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824d478f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8824dae3f0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824dac410> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88258910a0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824dae7b0> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824daf3e0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8824dde900> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824dcb260> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8824df9f40> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824dcb8c0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8824c23350> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824c221b0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824d44590> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824c68c50> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824c69c40> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824dd40e0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8824c6bad0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, 
"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": 
"/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "40", "epoch": "1727204140", "epoch_int": "1727204140", "date": "2024-09-24", "time": 
"14:55:40", "iso8601_micro": "2024-09-24T18:55:40.388052Z", "iso8601": "2024-09-24T18:55:40Z", "iso8601_basic": "20240924T145540388052", "iso8601_basic_short": "20240924T145540", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.70458984375, "5m": 0.49462890625, "15m": 0.24169921875}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": 
["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3041, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 675, "free": 3041}, "nocache": {"free": 3470, "used": 246}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], 
"masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 486, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325718528, "block_size": 4096, "block_total": 64479564, "block_available": 61358818, "block_used": 3120746, "inode_total": 16384000, "inode_available": 16301507, "inode_used": 82493, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing 
_warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing 
fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # 
cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing 
multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # 
cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing 
ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy 
ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy 
ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy 
multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math 
# cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy 
ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. [WARNING]: Module invocation had junk after the JSON data: [... repeated interpreter cleanup/teardown output, identical to the dump above, trimmed ...]
ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # 
destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves 
# destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping 
_thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed-node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. 
See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 15980 1727204140.90628: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204138.6994796-16302-2174381847922/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204140.90779: _low_level_execute_command(): starting 15980 1727204140.90783: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204138.6994796-16302-2174381847922/ > /dev/null 2>&1 && sleep 0' 15980 1727204140.92090: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204140.92381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all'
host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204140.92386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204140.92496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204140.94514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204140.94783: stderr chunk (state=3): >>><<< 15980 1727204140.94787: stdout chunk (state=3): >>><<< 15980 1727204140.94880: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204140.94884: handler run complete 15980 1727204140.95099: 
variable 'ansible_facts' from source: unknown 15980 1727204140.95295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204140.96031: variable 'ansible_facts' from source: unknown 15980 1727204140.96392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204140.96577: attempt loop complete, returning result 15980 1727204140.96582: _execute() done 15980 1727204140.96584: dumping result to json 15980 1727204140.96617: done dumping result, returning 15980 1727204140.96628: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-5f1d-4b72-00000000007e] 15980 1727204140.96636: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000007e 15980 1727204140.97187: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000007e 15980 1727204140.97191: WORKER PROCESS EXITING ok: [managed-node2] 15980 1727204140.98210: no more pending results, returning what we have 15980 1727204140.98215: results queue empty 15980 1727204140.98216: checking for any_errors_fatal 15980 1727204140.98218: done checking for any_errors_fatal 15980 1727204140.98218: checking for max_fail_percentage 15980 1727204140.98220: done checking for max_fail_percentage 15980 1727204140.98221: checking to see if all hosts have failed and the running result is not ok 15980 1727204140.98222: done checking to see if all hosts have failed 15980 1727204140.98223: getting the remaining hosts for this loop 15980 1727204140.98224: done getting the remaining hosts for this loop 15980 1727204140.98229: getting the next task for host managed-node2 15980 1727204140.98236: done getting next task for host managed-node2 15980 1727204140.98238: ^ task is: TASK: meta (flush_handlers) 15980 1727204140.98240: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204140.98244: getting variables 15980 1727204140.98246: in VariableManager get_vars() 15980 1727204140.98274: Calling all_inventory to load vars for managed-node2 15980 1727204140.98277: Calling groups_inventory to load vars for managed-node2 15980 1727204140.98281: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204140.98292: Calling all_plugins_play to load vars for managed-node2 15980 1727204140.98295: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204140.98299: Calling groups_plugins_play to load vars for managed-node2 15980 1727204140.99238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204141.00155: done with get_vars() 15980 1727204141.00485: done getting variables 15980 1727204141.00564: in VariableManager get_vars() 15980 1727204141.00580: Calling all_inventory to load vars for managed-node2 15980 1727204141.00582: Calling groups_inventory to load vars for managed-node2 15980 1727204141.00585: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204141.00591: Calling all_plugins_play to load vars for managed-node2 15980 1727204141.00594: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204141.00597: Calling groups_plugins_play to load vars for managed-node2 15980 1727204141.00950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204141.01334: done with get_vars() 15980 1727204141.01351: done queuing things up, now waiting for results queue to drain 15980 1727204141.01354: results queue empty 15980 1727204141.01355: checking for any_errors_fatal 15980 1727204141.01358: done checking for any_errors_fatal 15980 1727204141.01359: checking for max_fail_percentage 15980 
1727204141.01360: done checking for max_fail_percentage 15980 1727204141.01361: checking to see if all hosts have failed and the running result is not ok 15980 1727204141.01361: done checking to see if all hosts have failed 15980 1727204141.01370: getting the remaining hosts for this loop 15980 1727204141.01371: done getting the remaining hosts for this loop 15980 1727204141.01374: getting the next task for host managed-node2 15980 1727204141.01379: done getting next task for host managed-node2 15980 1727204141.01382: ^ task is: TASK: Include the task 'el_repo_setup.yml' 15980 1727204141.01384: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204141.01386: getting variables 15980 1727204141.01387: in VariableManager get_vars() 15980 1727204141.01397: Calling all_inventory to load vars for managed-node2 15980 1727204141.01399: Calling groups_inventory to load vars for managed-node2 15980 1727204141.01402: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204141.01407: Calling all_plugins_play to load vars for managed-node2 15980 1727204141.01410: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204141.01413: Calling groups_plugins_play to load vars for managed-node2 15980 1727204141.01886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204141.02255: done with get_vars() 15980 1727204141.02268: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:11 Tuesday 24 September 2024 14:55:41 -0400 (0:00:02.405) 0:00:02.433 ***** 15980 
1727204141.02357: entering _queue_task() for managed-node2/include_tasks 15980 1727204141.02359: Creating lock for include_tasks 15980 1727204141.02961: worker is 1 (out of 1 available) 15980 1727204141.02975: exiting _queue_task() for managed-node2/include_tasks 15980 1727204141.02986: done queuing things up, now waiting for results queue to drain 15980 1727204141.02988: waiting for pending results... 15980 1727204141.03398: running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' 15980 1727204141.03404: in run() - task 127b8e07-fff9-5f1d-4b72-000000000006 15980 1727204141.03408: variable 'ansible_search_path' from source: unknown 15980 1727204141.03410: calling self._execute() 15980 1727204141.03413: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204141.03416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204141.03419: variable 'omit' from source: magic vars 15980 1727204141.03495: _execute() done 15980 1727204141.03712: dumping result to json 15980 1727204141.03716: done dumping result, returning 15980 1727204141.03719: done running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' [127b8e07-fff9-5f1d-4b72-000000000006] 15980 1727204141.03721: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000006 15980 1727204141.03808: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000006 15980 1727204141.04076: WORKER PROCESS EXITING 15980 1727204141.04119: no more pending results, returning what we have 15980 1727204141.04124: in VariableManager get_vars() 15980 1727204141.04155: Calling all_inventory to load vars for managed-node2 15980 1727204141.04158: Calling groups_inventory to load vars for managed-node2 15980 1727204141.04161: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204141.04175: Calling all_plugins_play to load vars for managed-node2 15980 1727204141.04178: Calling 
groups_plugins_inventory to load vars for managed-node2 15980 1727204141.04186: Calling groups_plugins_play to load vars for managed-node2 15980 1727204141.04591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204141.05059: done with get_vars() 15980 1727204141.05178: variable 'ansible_search_path' from source: unknown 15980 1727204141.05197: we have included files to process 15980 1727204141.05198: generating all_blocks data 15980 1727204141.05200: done generating all_blocks data 15980 1727204141.05201: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15980 1727204141.05203: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15980 1727204141.05206: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15980 1727204141.06101: in VariableManager get_vars() 15980 1727204141.06119: done with get_vars() 15980 1727204141.06132: done processing included file 15980 1727204141.06134: iterating over new_blocks loaded from include file 15980 1727204141.06136: in VariableManager get_vars() 15980 1727204141.06152: done with get_vars() 15980 1727204141.06154: filtering new block on tags 15980 1727204141.06173: done filtering new block on tags 15980 1727204141.06177: in VariableManager get_vars() 15980 1727204141.06189: done with get_vars() 15980 1727204141.06190: filtering new block on tags 15980 1727204141.06208: done filtering new block on tags 15980 1727204141.06210: in VariableManager get_vars() 15980 1727204141.06220: done with get_vars() 15980 1727204141.06222: filtering new block on tags 15980 1727204141.06235: done filtering new block on tags 15980 1727204141.06237: done iterating over new_blocks loaded from include file included: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node2 15980 1727204141.06244: extending task lists for all hosts with included blocks 15980 1727204141.06306: done extending task lists 15980 1727204141.06308: done processing included files 15980 1727204141.06308: results queue empty 15980 1727204141.06309: checking for any_errors_fatal 15980 1727204141.06311: done checking for any_errors_fatal 15980 1727204141.06311: checking for max_fail_percentage 15980 1727204141.06312: done checking for max_fail_percentage 15980 1727204141.06313: checking to see if all hosts have failed and the running result is not ok 15980 1727204141.06314: done checking to see if all hosts have failed 15980 1727204141.06315: getting the remaining hosts for this loop 15980 1727204141.06316: done getting the remaining hosts for this loop 15980 1727204141.06318: getting the next task for host managed-node2 15980 1727204141.06322: done getting next task for host managed-node2 15980 1727204141.06324: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 15980 1727204141.06327: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204141.06328: getting variables 15980 1727204141.06329: in VariableManager get_vars() 15980 1727204141.06339: Calling all_inventory to load vars for managed-node2 15980 1727204141.06341: Calling groups_inventory to load vars for managed-node2 15980 1727204141.06344: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204141.06349: Calling all_plugins_play to load vars for managed-node2 15980 1727204141.06352: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204141.06355: Calling groups_plugins_play to load vars for managed-node2 15980 1727204141.06531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204141.06733: done with get_vars() 15980 1727204141.06743: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:55:41 -0400 (0:00:00.044) 0:00:02.478 ***** 15980 1727204141.06822: entering _queue_task() for managed-node2/setup 15980 1727204141.07273: worker is 1 (out of 1 available) 15980 1727204141.07283: exiting _queue_task() for managed-node2/setup 15980 1727204141.07294: done queuing things up, now waiting for results queue to drain 15980 1727204141.07296: waiting for pending results... 
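[Editor's note] The task banner above shows the run entering `_queue_task()` for a `setup` action that gathers only a minimal fact subset, guarded by the conditional evaluated a few entries further on (`not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts`). A minimal sketch of what such a task can look like — a hypothetical reconstruction, not the actual contents of `el_repo_setup.yml` (which lives at `tasks/el_repo_setup.yml:3`); the `gather_subset` value is an assumption, since the log only says "minimum subset":

```yaml
# Hypothetical sketch of the kind of task this log records -- not the real file.
- name: Gather the minimum subset of ansible_facts required by the network role test
  setup:
    gather_subset: min   # assumption; restricts fact collection to the minimal set
  when: not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts
```

The `when:` expression matches the conditional the log reports as `True`, which is why the `setup` module is dispatched here rather than skipped.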
15980 1727204141.07484: running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 15980 1727204141.07609: in run() - task 127b8e07-fff9-5f1d-4b72-00000000008f 15980 1727204141.07633: variable 'ansible_search_path' from source: unknown 15980 1727204141.07669: variable 'ansible_search_path' from source: unknown 15980 1727204141.07692: calling self._execute() 15980 1727204141.07780: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204141.07792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204141.07805: variable 'omit' from source: magic vars 15980 1727204141.08425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204141.10741: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204141.10832: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204141.10918: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204141.10972: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204141.11025: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204141.11134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204141.11276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204141.11279: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204141.11282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204141.11285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204141.11492: variable 'ansible_facts' from source: unknown 15980 1727204141.11578: variable 'network_test_required_facts' from source: task vars 15980 1727204141.11631: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 15980 1727204141.11645: variable 'omit' from source: magic vars 15980 1727204141.11694: variable 'omit' from source: magic vars 15980 1727204141.11741: variable 'omit' from source: magic vars 15980 1727204141.12084: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204141.12088: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204141.12091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204141.12094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204141.12097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204141.12109: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204141.12116: variable 'ansible_host' from source: host vars for 
'managed-node2' 15980 1727204141.12122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204141.12411: Set connection var ansible_connection to ssh 15980 1727204141.12414: Set connection var ansible_pipelining to False 15980 1727204141.12417: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204141.12419: Set connection var ansible_timeout to 10 15980 1727204141.12422: Set connection var ansible_shell_type to sh 15980 1727204141.12424: Set connection var ansible_shell_executable to /bin/sh 15980 1727204141.12570: variable 'ansible_shell_executable' from source: unknown 15980 1727204141.12573: variable 'ansible_connection' from source: unknown 15980 1727204141.12576: variable 'ansible_module_compression' from source: unknown 15980 1727204141.12579: variable 'ansible_shell_type' from source: unknown 15980 1727204141.12581: variable 'ansible_shell_executable' from source: unknown 15980 1727204141.12584: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204141.12587: variable 'ansible_pipelining' from source: unknown 15980 1727204141.12590: variable 'ansible_timeout' from source: unknown 15980 1727204141.12592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204141.12989: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204141.13061: variable 'omit' from source: magic vars 15980 1727204141.13067: starting attempt loop 15980 1727204141.13070: running the handler 15980 1727204141.13072: _low_level_execute_command(): starting 15980 1727204141.13075: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204141.14703: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204141.14727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204141.14797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204141.14887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204141.14906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204141.15048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15980 1727204141.17483: stdout chunk (state=3): >>>/root <<< 15980 1727204141.17806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204141.17883: stderr chunk (state=3): >>><<< 15980 1727204141.18115: stdout chunk (state=3): >>><<< 15980 1727204141.18120: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15980 1727204141.18131: _low_level_execute_command(): starting 15980 1727204141.18135: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204141.1801286-16446-187319638351374 `" && echo ansible-tmp-1727204141.1801286-16446-187319638351374="` echo /root/.ansible/tmp/ansible-tmp-1727204141.1801286-16446-187319638351374 `" ) && sleep 0' 15980 1727204141.20175: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204141.20191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204141.20379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15980 1727204141.23109: stdout chunk (state=3): >>>ansible-tmp-1727204141.1801286-16446-187319638351374=/root/.ansible/tmp/ansible-tmp-1727204141.1801286-16446-187319638351374 <<< 15980 1727204141.23330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204141.23432: stderr chunk (state=3): >>><<< 15980 1727204141.23443: stdout chunk (state=3): >>><<< 15980 1727204141.23474: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204141.1801286-16446-187319638351374=/root/.ansible/tmp/ansible-tmp-1727204141.1801286-16446-187319638351374 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15980 1727204141.23747: variable 'ansible_module_compression' from source: unknown 15980 1727204141.23771: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15980 1727204141.23842: variable 'ansible_facts' from source: unknown 15980 1727204141.24408: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204141.1801286-16446-187319638351374/AnsiballZ_setup.py 15980 1727204141.24755: Sending initial data 15980 1727204141.24759: Sent initial data (154 bytes) 15980 1727204141.26188: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204141.26362: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204141.26385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204141.26555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15980 1727204141.28772: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204141.28887: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204141.29010: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpoqk1lqie /root/.ansible/tmp/ansible-tmp-1727204141.1801286-16446-187319638351374/AnsiballZ_setup.py <<< 15980 1727204141.29013: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204141.1801286-16446-187319638351374/AnsiballZ_setup.py" <<< 15980 1727204141.29161: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpoqk1lqie" to remote "/root/.ansible/tmp/ansible-tmp-1727204141.1801286-16446-187319638351374/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204141.1801286-16446-187319638351374/AnsiballZ_setup.py" <<< 15980 1727204141.32376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204141.32381: stderr chunk (state=3): >>><<< 15980 1727204141.32384: stdout chunk (state=3): >>><<< 15980 1727204141.32386: done transferring module to remote 15980 1727204141.32389: _low_level_execute_command(): starting 15980 1727204141.32392: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204141.1801286-16446-187319638351374/ /root/.ansible/tmp/ansible-tmp-1727204141.1801286-16446-187319638351374/AnsiballZ_setup.py && sleep 0' 15980 1727204141.33853: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204141.33859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204141.33892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204141.34351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204141.36398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204141.36484: stderr chunk (state=3): >>><<< 15980 1727204141.36488: stdout chunk (state=3): >>><<< 15980 1727204141.36503: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204141.36510: _low_level_execute_command(): starting 15980 1727204141.36518: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204141.1801286-16446-187319638351374/AnsiballZ_setup.py && sleep 0' 15980 1727204141.38385: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204141.38390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204141.38572: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204141.38588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204141.38636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204141.38983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204141.39106: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 15980 1727204141.41993: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 15980 1727204141.42036: stdout chunk (state=3): >>>import '_io' # <<< 15980 1727204141.42131: stdout chunk (state=3): >>>import 'marshal' # <<< 15980 1727204141.42156: stdout chunk (state=3): >>>import 'posix' # <<< 15980 1727204141.42189: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 15980 1727204141.42228: stdout chunk (state=3): >>>import 'time' # <<< 15980 1727204141.42363: stdout chunk (state=3): >>> import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 15980 1727204141.42375: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204141.42408: stdout chunk (state=3): >>>import '_codecs' # <<< 15980 1727204141.42427: stdout chunk (state=3): >>> <<< 15980 1727204141.42456: stdout chunk (state=3): >>>import 'codecs' # <<< 15980 1727204141.42481: stdout chunk (state=3): >>> <<< 15980 1727204141.42563: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 15980 1727204141.42589: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abffc530><<< 15980 1727204141.42669: stdout chunk (state=3): >>> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abfcbb30><<< 15980 1727204141.42693: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abffeab0><<< 15980 1727204141.42728: stdout chunk (state=3): >>> import '_signal' # <<< 15980 1727204141.42738: stdout chunk (state=3): >>> <<< 15980 1727204141.42774: stdout chunk (state=3): >>>import '_abc' # <<< 15980 1727204141.42816: stdout chunk (state=3): >>> import 'abc' # <<< 15980 1727204141.42839: stdout chunk (state=3): >>> import 'io' # <<< 15980 1727204141.42897: stdout chunk (state=3): >>>import '_stat' # <<< 15980 1727204141.43026: stdout chunk (state=3): >>> import 'stat' # <<< 15980 1727204141.43059: stdout chunk (state=3): >>> import '_collections_abc' # <<< 15980 1727204141.43082: stdout chunk (state=3): >>> <<< 15980 1727204141.43121: stdout chunk (state=3): >>>import 'genericpath' # <<< 15980 1727204141.43157: stdout chunk (state=3): >>>import 'posixpath' # <<< 15980 1727204141.43257: stdout chunk (state=3): >>> <<< 15980 1727204141.43537: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe111c0><<< 15980 1727204141.43545: stdout chunk (state=3): >>> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe120c0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 15980 1727204141.43936: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15980 1727204141.43965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 15980 1727204141.43987: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 15980 1727204141.44033: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 15980 1727204141.44049: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15980 1727204141.44075: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 15980 1727204141.44101: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe4ffe0> <<< 15980 1727204141.44126: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 15980 1727204141.44195: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe64170> <<< 15980 1727204141.44206: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 15980 1727204141.44237: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 15980 1727204141.44285: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204141.44306: stdout chunk (state=3): >>>import 'itertools' # <<< 15980 1727204141.44368: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 15980 1727204141.44461: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe879b0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 15980 1727204141.44465: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe87f80> <<< 15980 1727204141.44469: stdout chunk (state=3): >>>import '_collections' # <<< 15980 1727204141.44490: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe67c50> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe653d0> <<< 15980 1727204141.44595: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe4d190> <<< 15980 1727204141.44615: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 15980 1727204141.44650: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 15980 1727204141.44653: stdout chunk (state=3): >>>import '_sre' # <<< 15980 1727204141.44719: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 15980 1727204141.44723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15980 1727204141.44756: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 15980 1727204141.44759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 15980 1727204141.44821: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abeab980> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abeaa5a0> <<< 15980 1727204141.44824: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe662a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abea8d70> <<< 15980 1727204141.44921: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abed8a10> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe4c410> <<< 15980 1727204141.44995: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 15980 1727204141.45018: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abed8ec0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abed8d70> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abed9160> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe4af30> <<< 15980 1727204141.45053: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204141.45080: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 15980 1727204141.45214: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 15980 1727204141.45220: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abed97f0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abed94f0> import 'importlib.machinery' # <<< 15980 1727204141.45250: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches 
/usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abeda6f0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 15980 1727204141.45274: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 15980 1727204141.45305: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 15980 1727204141.45327: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abef48f0> import 'errno' # <<< 15980 1727204141.45364: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204141.45439: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abef6030> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 15980 1727204141.45442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 15980 1727204141.45484: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abef6ed0> <<< 15980 1727204141.45588: stdout chunk (state=3): >>># extension module '_bz2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abef7500> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abef6420> <<< 15980 1727204141.45591: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15980 1727204141.45596: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abef7ec0> <<< 15980 1727204141.45606: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abef75f0> <<< 15980 1727204141.45699: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abeda660> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 15980 1727204141.45790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 15980 1727204141.45803: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abc4fd40> <<< 15980 1727204141.45826: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 15980 1727204141.45860: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abc78860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abc785c0> <<< 15980 1727204141.46095: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abc78890> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abc78a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abc4dee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 15980 1727204141.46103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15980 1727204141.46229: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 15980 1727204141.46247: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abc7a120> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abc78da0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abedae10> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15980 1727204141.46306: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204141.46328: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15980 1727204141.46430: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abca24e0> <<< 15980 1727204141.46507: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 15980 1727204141.46576: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204141.46594: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 15980 1727204141.46646: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abcbe660> <<< 15980 1727204141.46767: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 15980 1727204141.46791: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15980 1727204141.46911: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abcf3410> <<< 15980 1727204141.46937: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 15980 1727204141.46962: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 15980 1727204141.47031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15980 1727204141.47177: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abd1dbb0> <<< 15980 1727204141.47286: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abcf3530> <<< 15980 1727204141.47359: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abcbed20> <<< 15980 1727204141.47414: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abb10590> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abcbd6a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abc7b080> <<< 15980 1727204141.47690: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15980 1727204141.47749: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa3abcbd430> <<< 15980 1727204141.48017: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_jvt17khg/ansible_setup_payload.zip' # zipimport: zlib available <<< 15980 1727204141.48544: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15980 1727204141.48548: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abb7e2d0> import '_typing' # <<< 15980 1727204141.48947: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abb551c0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abb54320> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available <<< 15980 1727204141.48954: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 15980 1727204141.48975: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.50680: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.51944: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches 
/usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abb578f0> <<< 15980 1727204141.51984: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 15980 1727204141.52014: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 15980 1727204141.52053: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abbadc10> <<< 15980 1727204141.52097: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abbada30> <<< 15980 1727204141.52136: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abbad370> <<< 15980 1727204141.52201: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 15980 1727204141.52263: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abbadc70> import 'json' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa3abb7ef60> import 'atexit' # <<< 15980 1727204141.52315: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abbae960> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abbaeb70> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 15980 1727204141.52383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 15980 1727204141.52387: stdout chunk (state=3): >>>import '_locale' # <<< 15980 1727204141.52484: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abbaef90> import 'pwd' # <<< 15980 1727204141.52508: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 15980 1727204141.52525: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba14dd0> <<< 15980 1727204141.52557: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba169f0> 
<<< 15980 1727204141.52591: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 15980 1727204141.52701: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba173b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 15980 1727204141.52722: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba18560> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 15980 1727204141.52744: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 15980 1727204141.52773: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15980 1727204141.53016: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba1b080> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba1b1a0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba19340> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15980 1727204141.53023: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 15980 1727204141.53075: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 15980 1727204141.53078: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba1ef60> <<< 15980 1727204141.53083: stdout chunk (state=3): >>>import '_tokenize' # <<< 15980 1727204141.53142: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba1da30> <<< 15980 1727204141.53168: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba1d790> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 15980 1727204141.53184: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15980 1727204141.53254: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba1ffe0> <<< 15980 1727204141.53283: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba19850> <<< 15980 1727204141.53443: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba630b0> <<< 15980 1727204141.53469: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba63200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba68e30> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba68bf0> <<< 15980 1727204141.53487: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15980 1727204141.53603: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 15980 1727204141.53658: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba6b2c0> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba69460> <<< 15980 1727204141.53685: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 15980 1727204141.53724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204141.53752: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 15980 1727204141.53778: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 15980 1727204141.53817: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba72a20> <<< 15980 1727204141.53954: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba6b3b0> <<< 15980 1727204141.54033: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba73800> <<< 15980 1727204141.54068: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba73a10> <<< 15980 1727204141.54117: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba73da0> <<< 15980 1727204141.54220: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba63530> <<< 15980 1727204141.54223: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 15980 1727204141.54241: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204141.54263: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba77440> <<< 15980 1727204141.54520: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba78740> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba75be0> <<< 15980 1727204141.54557: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba76f90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba758e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 15980 1727204141.54647: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.54752: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.54850: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 15980 1727204141.54860: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.54951: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.55084: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.55704: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.56382: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 15980 1727204141.56417: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204141.56429: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3ab900800> <<< 15980 1727204141.56709: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 15980 1727204141.56712: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab901580> <<< 15980 1727204141.56724: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba7bc80> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 15980 1727204141.56817: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.56988: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 15980 1727204141.57054: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab9016d0> # zipimport: zlib available <<< 15980 1727204141.57904: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.58047: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.58121: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.58203: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 15980 1727204141.58247: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.58287: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 15980 1727204141.58300: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15980 1727204141.58421: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.58475: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 15980 1727204141.58511: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 15980 1727204141.58515: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.58561: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.58616: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 15980 1727204141.58627: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.58877: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.59225: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15980 1727204141.59230: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 15980 1727204141.59278: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab9024b0> <<< 15980 1727204141.59301: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.59461: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.59468: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 15980 1727204141.59493: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15980 1727204141.59634: stdout chunk (state=3): >>># extension module '_hashlib' loaded from 
'/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204141.59761: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3ab90a1b0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204141.60121: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3ab90aa80> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab903350> <<< 15980 1727204141.60125: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15980 1727204141.60128: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15980 1727204141.60163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204141.60333: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3ab909970> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab90acc0> import 
'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 15980 1727204141.60347: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.60411: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.60479: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.60508: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.60548: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 15980 1727204141.60572: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204141.60590: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 15980 1727204141.60641: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 15980 1727204141.60653: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15980 1727204141.60688: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 15980 1727204141.60705: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 15980 1727204141.60757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15980 1727204141.60782: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab99ee10> <<< 15980 1727204141.60834: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab914b60> <<< 15980 
1727204141.60921: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab912c90> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab912ae0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 15980 1727204141.60934: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.60990: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15980 1727204141.61052: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 15980 1727204141.61099: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 15980 1727204141.61174: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.61235: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.61275: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.61292: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.61327: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.61370: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.61416: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.61444: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 15980 1727204141.61461: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.61543: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.61630: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.61658: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.61729: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 15980 1727204141.61757: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15980 1727204141.61892: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.62086: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.62130: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.62209: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204141.62289: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 15980 1727204141.62316: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab9a1c70> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 15980 1727204141.62351: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 15980 1727204141.62435: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 15980 1727204141.62443: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 15980 1727204141.62599: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aaf10470> <<< 15980 1727204141.62615: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aaf10a40> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab97d4f0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab97c6e0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab9a0350> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab9a0770> <<< 15980 1727204141.62634: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 15980 1727204141.62679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 15980 1727204141.62713: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 15980 1727204141.62748: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 15980 1727204141.62771: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204141.62795: stdout chunk (state=3): >>># extension module '_heapq' executed 
from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aaf13710> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aaf12fc0> <<< 15980 1727204141.62948: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aaf131a0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aaf123f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 15980 1727204141.62976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aaf13890> <<< 15980 1727204141.62990: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 15980 1727204141.63016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 15980 1727204141.63046: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aaf7a390> <<< 15980 1727204141.63085: stdout chunk (state=3): >>>import 'multiprocessing.connection' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa3aaf783b0> <<< 15980 1727204141.63116: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab9a1430> import 'ansible.module_utils.facts.timeout' # <<< 15980 1727204141.63151: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 15980 1727204141.63263: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available <<< 15980 1727204141.63306: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 15980 1727204141.63330: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.63384: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.63435: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 15980 1727204141.63460: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.63693: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 15980 1727204141.63719: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available <<< 15980 1727204141.63768: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 15980 1727204141.63784: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.63840: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.63909: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.63994: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.64037: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.utils' # <<< 15980 1727204141.64060: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 15980 1727204141.64078: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.64616: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.65090: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 15980 1727204141.65229: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.65255: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15980 1727204141.65285: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 15980 1727204141.65325: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.65357: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 15980 1727204141.65374: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.65422: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.65493: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 15980 1727204141.65496: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.65529: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.65588: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 15980 1727204141.65591: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.65624: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.65697: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 15980 1727204141.65767: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.65817: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 15980 1727204141.65856: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aaf7b110> <<< 15980 1727204141.65914: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 15980 1727204141.66031: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aaf7b350> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 15980 1727204141.66103: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.66192: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 15980 1727204141.66248: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.66282: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.66422: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 15980 1727204141.66456: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.66459: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.66552: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 15980 1727204141.66625: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.66651: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 15980 1727204141.66679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 15980 1727204141.66757: stdout chunk (state=3): >>># extension 
module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204141.66818: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aafae7b0> <<< 15980 1727204141.67185: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aaf97590> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # <<< 15980 1727204141.67188: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.67246: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.67381: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.67716: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 15980 1727204141.67724: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 15980 1727204141.67812: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 15980 1727204141.67842: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204141.67873: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aadb2060> 
import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aadb1c10> import 'ansible.module_utils.facts.system.user' # <<< 15980 1727204141.67904: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 15980 1727204141.67924: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.67957: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.68240: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 15980 1727204141.68254: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.68387: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 15980 1727204141.68464: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.68600: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.68616: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.68667: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 15980 1727204141.68695: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 15980 1727204141.68724: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15980 1727204141.68892: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.69035: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 15980 1727204141.69046: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.69177: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.69332: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 15980 1727204141.69354: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.69395: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.70319: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.70571: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 15980 1727204141.70595: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.70688: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.70802: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 15980 1727204141.70826: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.70916: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.71102: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 15980 1727204141.71205: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.71370: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 15980 1727204141.71402: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 15980 1727204141.71529: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.71543: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 15980 1727204141.71616: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.71959: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15980 1727204141.72184: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 15980 1727204141.72197: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 
1727204141.72230: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.72262: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 15980 1727204141.72289: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.72338: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 15980 1727204141.72411: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.72483: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 15980 1727204141.72515: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15980 1727204141.72548: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 15980 1727204141.72561: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.72619: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.72676: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 15980 1727204141.72697: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.72745: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.72809: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 15980 1727204141.72826: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.73110: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.73389: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 15980 1727204141.73472: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.73500: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.73609: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 15980 1727204141.73649: stdout 
chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 15980 1727204141.73669: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.73697: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 15980 1727204141.73738: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.73752: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.73809: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 15980 1727204141.73861: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.73991: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 15980 1727204141.73995: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.74033: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available <<< 15980 1727204141.74089: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 15980 1727204141.74206: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.74211: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.74238: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15980 1727204141.74316: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.74399: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 15980 1727204141.74413: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 15980 1727204141.74462: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.74549: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 15980 1727204141.74772: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.74988: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 15980 1727204141.75072: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.75110: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 15980 1727204141.75225: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 15980 1727204141.75262: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.75389: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 15980 1727204141.75441: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.75525: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.75569: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 15980 1727204141.75643: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204141.76750: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 15980 1727204141.76755: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 15980 1727204141.76773: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from 
'/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aaddb7a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aadd8ce0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aadd82c0> <<< 15980 1727204141.77108: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_r<<< 15980 1727204141.77129: stdout chunk (state=3): >>>eal_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": 
"en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "41", "epoch": "1727204141", "epoch_int": "1727204141", "date": "2024-09-24", "time": "14:55:41", "iso8601_micro": "2024-09-24T18:55:41.769258Z", "iso8601": "2024-09-24T18:55:41Z", "iso8601_basic": "20240924T145541769258", "iso8601_basic_short": "20240924T145541", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15980 1727204141.77653: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type <<< 15980 1727204141.77735: stdout chunk (state=3): >>># clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack <<< 15980 1727204141.77739: stdout chunk (state=3): >>># destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing 
importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random <<< 15980 1727204141.77793: stdout chunk (state=3): >>># cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing 
pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters <<< 15980 1727204141.77972: stdout chunk (state=3): >>># cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # 
cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file <<< 15980 1727204141.77977: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy 
ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing 
ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] 
removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy 
ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy 
ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 15980 1727204141.78262: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15980 1727204141.78482: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 15980 1727204141.78485: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 15980 1727204141.78488: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # 
destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 15980 1727204141.78517: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 15980 1727204141.78683: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 15980 1727204141.78687: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process <<< 15980 1727204141.78720: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json <<< 15980 1727204141.78847: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 15980 1727204141.78961: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # 
cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys 
# cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15980 1727204141.79213: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize <<< 15980 1727204141.79247: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15980 1727204141.79344: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs <<< 15980 1727204141.79506: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 15980 1727204141.79911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 15980 1727204141.79934: stderr chunk (state=3): >>><<< 15980 1727204141.79951: stdout chunk (state=3): >>><<< 15980 1727204141.80296: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abffc530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abfcbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abffeab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe111c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe120c0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe4ffe0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe64170> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe879b0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe87f80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe67c50> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe653d0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe4d190> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abeab980> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa3abeaa5a0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe662a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abea8d70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abed8a10> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe4c410> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abed8ec0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abed8d70> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abed9160> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abe4af30> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abed97f0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abed94f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abeda6f0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abef48f0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abef6030> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa3abef6ed0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abef7500> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abef6420> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abef7ec0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abef75f0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abeda660> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abc4fd40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abc78860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abc785c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abc78890> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abc78a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abc4dee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abc7a120> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abc78da0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abedae10> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abca24e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abcbe660> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abcf3410> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abd1dbb0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abcf3530> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abcbed20> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abb10590> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abcbd6a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abc7b080> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa3abcbd430> # zipimport: found 103 names in '/tmp/ansible_setup_payload_jvt17khg/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abb7e2d0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abb551c0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abb54320> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abb578f0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abbadc10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abbada30> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abbad370> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abbadc70> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abb7ef60> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abbae960> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3abbaeb70> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3abbaef90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba14dd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba169f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba173b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba18560> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba1b080> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba1b1a0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba19340> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba1ef60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba1da30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba1d790> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba1ffe0> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba19850> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba630b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba63200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba68e30> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba68bf0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba6b2c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba69460> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba72a20> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba6b3b0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba73800> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba73a10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba73da0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba63530> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba77440> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba78740> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba75be0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aba76f90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba758e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3ab900800> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab901580> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aba7bc80> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab9016d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab9024b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3ab90a1b0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3ab90aa80> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab903350> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3ab909970> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab90acc0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab99ee10> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab914b60> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab912c90> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab912ae0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab9a1c70> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aaf10470> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aaf10a40> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab97d4f0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab97c6e0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab9a0350> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab9a0770> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aaf13710> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aaf12fc0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aaf131a0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aaf123f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aaf13890> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aaf7a390> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aaf783b0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3ab9a1430> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aaf7b110> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aaf7b350> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aafae7b0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aaf97590> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aadb2060> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aadb1c10> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3aaddb7a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aadd8ce0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3aadd82c0> {"ansible_facts": 
{"ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", 
"ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": 
"dnf", "ansible_service_mgr": "systemd", "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "41", "epoch": "1727204141", "epoch_int": "1727204141", "date": "2024-09-24", "time": "14:55:41", "iso8601_micro": "2024-09-24T18:55:41.769258Z", "iso8601": "2024-09-24T18:55:41Z", "iso8601_basic": "20240924T145541769258", "iso8601_basic_short": "20240924T145541", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # 
cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy 
_weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] 
removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing 
ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # 
cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing 
ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy 
ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy 
ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc 
# cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # 
destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
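The SSH debug lines above show Ansible reusing an existing ControlMaster socket (`auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320'`) rather than opening a fresh connection for the task. As a sketch, one can query such a master out of band with `ssh -O check`; the control path and host here are taken from the debug output above, and the command is only constructed, not executed:

```python
import subprocess  # would be used to actually run the check

# Control socket path and target host as seen in the debug output above.
control_path = "/root/.ansible/cp/7ef5e35320"
host = "10.31.47.73"

# 'ssh -O check' asks the master process behind ControlPath whether it
# is still alive; it exits 0 when a multiplexed master connection exists.
cmd = ["ssh", "-O", "check", "-o", f"ControlPath={control_path}", host]
print(" ".join(cmd))
# e.g. subprocess.run(cmd) on the controller would report the master's status.
```

Reusing the master is why the task ends with `mux_client_request_session` and a fast exit instead of a full key exchange.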
[WARNING]: Module invocation had junk after the JSON data: (junk body omitted: verbatim repeat of the interpreter cleanup[2]/cleanup[3]/destroy trace already shown in the module output above, ending at "clear sys.audit hooks") 15980 1727204141.83680: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204141.1801286-16446-187319638351374/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204141.83684: _low_level_execute_command(): starting 15980 1727204141.83687: _low_level_execute_command():
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204141.1801286-16446-187319638351374/ > /dev/null 2>&1 && sleep 0' 15980 1727204141.83992: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204141.83996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204141.84016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204141.84043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204141.84301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204141.86192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204141.86389: stderr chunk (state=3): >>><<< 15980 1727204141.86393: stdout chunk (state=3): >>><<< 15980 1727204141.86396: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204141.86398: handler run complete 15980 1727204141.86610: variable 'ansible_facts' from source: unknown 15980 1727204141.86663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204141.86807: variable 'ansible_facts' from source: unknown 15980 1727204141.86886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204141.87046: attempt loop complete, returning result 15980 1727204141.87049: _execute() done 15980 1727204141.87052: dumping result to json 15980 1727204141.87054: done dumping result, returning 15980 1727204141.87056: done running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [127b8e07-fff9-5f1d-4b72-00000000008f] 15980 1727204141.87058: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000008f ok: [managed-node2] 15980 1727204141.87379: no more pending results, 
returning what we have 15980 1727204141.87382: results queue empty 15980 1727204141.87383: checking for any_errors_fatal 15980 1727204141.87384: done checking for any_errors_fatal 15980 1727204141.87385: checking for max_fail_percentage 15980 1727204141.87386: done checking for max_fail_percentage 15980 1727204141.87387: checking to see if all hosts have failed and the running result is not ok 15980 1727204141.87388: done checking to see if all hosts have failed 15980 1727204141.87389: getting the remaining hosts for this loop 15980 1727204141.87390: done getting the remaining hosts for this loop 15980 1727204141.87394: getting the next task for host managed-node2 15980 1727204141.87402: done getting next task for host managed-node2 15980 1727204141.87404: ^ task is: TASK: Check if system is ostree 15980 1727204141.87407: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204141.87410: getting variables 15980 1727204141.87411: in VariableManager get_vars() 15980 1727204141.87437: Calling all_inventory to load vars for managed-node2 15980 1727204141.87440: Calling groups_inventory to load vars for managed-node2 15980 1727204141.87443: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204141.87450: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000008f 15980 1727204141.87453: WORKER PROCESS EXITING 15980 1727204141.87464: Calling all_plugins_play to load vars for managed-node2 15980 1727204141.87469: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204141.87473: Calling groups_plugins_play to load vars for managed-node2 15980 1727204141.87691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204141.87902: done with get_vars() 15980 1727204141.87921: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:55:41 -0400 (0:00:00.812) 0:00:03.290 ***** 15980 1727204141.88035: entering _queue_task() for managed-node2/stat 15980 1727204141.88553: worker is 1 (out of 1 available) 15980 1727204141.88687: exiting _queue_task() for managed-node2/stat 15980 1727204141.88701: done queuing things up, now waiting for results queue to drain 15980 1727204141.88703: waiting for pending results... 
15980 1727204141.89188: running TaskExecutor() for managed-node2/TASK: Check if system is ostree 15980 1727204141.89431: in run() - task 127b8e07-fff9-5f1d-4b72-000000000091 15980 1727204141.89435: variable 'ansible_search_path' from source: unknown 15980 1727204141.89442: variable 'ansible_search_path' from source: unknown 15980 1727204141.89446: calling self._execute() 15980 1727204141.89574: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204141.89582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204141.89662: variable 'omit' from source: magic vars 15980 1727204141.90182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204141.90477: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204141.90524: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204141.90576: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204141.90641: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204141.90749: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204141.90795: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204141.90829: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204141.90877: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204141.91095: Evaluated conditional (not __network_is_ostree is defined): True 15980 1727204141.91100: variable 'omit' from source: magic vars 15980 1727204141.91103: variable 'omit' from source: magic vars 15980 1727204141.91140: variable 'omit' from source: magic vars 15980 1727204141.91174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204141.91221: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204141.91312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204141.91316: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204141.91319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204141.91334: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204141.91345: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204141.91421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204141.91477: Set connection var ansible_connection to ssh 15980 1727204141.91492: Set connection var ansible_pipelining to False 15980 1727204141.91503: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204141.91514: Set connection var ansible_timeout to 10 15980 1727204141.91529: Set connection var ansible_shell_type to sh 15980 1727204141.91560: Set connection var ansible_shell_executable to /bin/sh 15980 1727204141.91618: variable 'ansible_shell_executable' from source: unknown 15980 1727204141.91671: variable 'ansible_connection' from 
source: unknown 15980 1727204141.91681: variable 'ansible_module_compression' from source: unknown 15980 1727204141.91725: variable 'ansible_shell_type' from source: unknown 15980 1727204141.91728: variable 'ansible_shell_executable' from source: unknown 15980 1727204141.91731: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204141.91733: variable 'ansible_pipelining' from source: unknown 15980 1727204141.91736: variable 'ansible_timeout' from source: unknown 15980 1727204141.91738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204141.91955: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204141.92104: variable 'omit' from source: magic vars 15980 1727204141.92107: starting attempt loop 15980 1727204141.92110: running the handler 15980 1727204141.92113: _low_level_execute_command(): starting 15980 1727204141.92116: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204141.93224: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204141.93290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204141.93374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15980 1727204141.95300: stdout chunk (state=3): >>>/root <<< 15980 1727204141.95603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204141.95608: stdout chunk (state=3): >>><<< 15980 1727204141.95612: stderr chunk (state=3): >>><<< 15980 1727204141.95615: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from 
master 0 15980 1727204141.95630: _low_level_execute_command(): starting 15980 1727204141.95634: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204141.954911-16485-203793281700435 `" && echo ansible-tmp-1727204141.954911-16485-203793281700435="` echo /root/.ansible/tmp/ansible-tmp-1727204141.954911-16485-203793281700435 `" ) && sleep 0' 15980 1727204141.96305: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204141.96377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204141.96447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204141.96460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204141.96593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15980 1727204141.99373: stdout chunk (state=3): 
>>>ansible-tmp-1727204141.954911-16485-203793281700435=/root/.ansible/tmp/ansible-tmp-1727204141.954911-16485-203793281700435 <<< 15980 1727204141.99658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204141.99662: stdout chunk (state=3): >>><<< 15980 1727204141.99668: stderr chunk (state=3): >>><<< 15980 1727204141.99689: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204141.954911-16485-203793281700435=/root/.ansible/tmp/ansible-tmp-1727204141.954911-16485-203793281700435 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15980 1727204141.99831: variable 'ansible_module_compression' from source: unknown 15980 1727204141.99853: ANSIBALLZ: Using lock for stat 15980 1727204141.99861: ANSIBALLZ: Acquiring lock 15980 1727204141.99873: ANSIBALLZ: Lock acquired: 139981197613136 15980 1727204141.99882: ANSIBALLZ: 
Creating module 15980 1727204142.16459: ANSIBALLZ: Writing module into payload 15980 1727204142.16591: ANSIBALLZ: Writing module 15980 1727204142.16630: ANSIBALLZ: Renaming module 15980 1727204142.16672: ANSIBALLZ: Done creating module 15980 1727204142.16676: variable 'ansible_facts' from source: unknown 15980 1727204142.16771: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204141.954911-16485-203793281700435/AnsiballZ_stat.py 15980 1727204142.16997: Sending initial data 15980 1727204142.17001: Sent initial data (152 bytes) 15980 1727204142.17867: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204142.17896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204142.17924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204142.17955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204142.18057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15980 1727204142.20436: stderr chunk (state=3): 
>>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204142.20560: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15980 1727204142.20655: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpznpclhpy /root/.ansible/tmp/ansible-tmp-1727204141.954911-16485-203793281700435/AnsiballZ_stat.py <<< 15980 1727204142.20659: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204141.954911-16485-203793281700435/AnsiballZ_stat.py" <<< 15980 1727204142.20739: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpznpclhpy" to remote "/root/.ansible/tmp/ansible-tmp-1727204141.954911-16485-203793281700435/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204141.954911-16485-203793281700435/AnsiballZ_stat.py" <<< 15980 1727204142.21759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204142.21805: stderr chunk (state=3): >>><<< 15980 1727204142.21876: stdout chunk (state=3): >>><<< 15980 1727204142.21880: done transferring module to remote 15980 1727204142.21882: _low_level_execute_command(): starting 
15980 1727204142.21901: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204141.954911-16485-203793281700435/ /root/.ansible/tmp/ansible-tmp-1727204141.954911-16485-203793281700435/AnsiballZ_stat.py && sleep 0' 15980 1727204142.22664: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204142.22683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204142.22696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204142.22712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204142.22752: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 15980 1727204142.22763: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15980 1727204142.22868: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204142.22894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204142.22907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204142.23010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15980 1727204142.25695: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 15980 1727204142.25712: stdout chunk (state=3): >>><<< 15980 1727204142.25726: stderr chunk (state=3): >>><<< 15980 1727204142.25755: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15980 1727204142.25764: _low_level_execute_command(): starting 15980 1727204142.25779: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204141.954911-16485-203793281700435/AnsiballZ_stat.py && sleep 0' 15980 1727204142.26474: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204142.26479: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204142.26481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 
1727204142.26490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204142.26513: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 1727204142.26620: stderr chunk (state=3): >>>debug2: match not found <<< 15980 1727204142.26634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204142.26640: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15980 1727204142.26643: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 15980 1727204142.26646: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204142.26707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204142.26796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15980 1727204142.30195: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 15980 1727204142.30260: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 15980 1727204142.30333: stdout chunk (state=3): >>> import 'posix' # <<< 15980 1727204142.30355: stdout chunk (state=3): >>> <<< 15980 1727204142.30387: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 15980 1727204142.30391: stdout chunk (state=3): >>># 
installing zipimport hook <<< 15980 1727204142.30411: stdout chunk (state=3): >>>import 'time' # <<< 15980 1727204142.30448: stdout chunk (state=3): >>> import 'zipimport' # # installed zipimport hook<<< 15980 1727204142.30529: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 15980 1727204142.30567: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204142.30590: stdout chunk (state=3): >>>import '_codecs' # <<< 15980 1727204142.30626: stdout chunk (state=3): >>>import 'codecs' # <<< 15980 1727204142.30690: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 15980 1727204142.30734: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16fff18530><<< 15980 1727204142.30750: stdout chunk (state=3): >>> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffee7b30> <<< 15980 1727204142.30798: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc'<<< 15980 1727204142.30817: stdout chunk (state=3): >>> import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16fff1aab0><<< 15980 1727204142.30853: stdout chunk (state=3): >>> import '_signal' # <<< 15980 1727204142.30905: stdout chunk (state=3): >>>import '_abc' # <<< 15980 1727204142.30913: stdout chunk (state=3): >>> <<< 15980 1727204142.30942: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 15980 1727204142.31002: stdout chunk (state=3): >>>import 
'_stat' # <<< 15980 1727204142.31005: stdout chunk (state=3): >>> import 'stat' # <<< 15980 1727204142.31148: stdout chunk (state=3): >>> import '_collections_abc' # <<< 15980 1727204142.31196: stdout chunk (state=3): >>>import 'genericpath' # <<< 15980 1727204142.31216: stdout chunk (state=3): >>>import 'posixpath' # <<< 15980 1727204142.31267: stdout chunk (state=3): >>>import 'os' # <<< 15980 1727204142.31296: stdout chunk (state=3): >>> import '_sitebuiltins' # <<< 15980 1727204142.31321: stdout chunk (state=3): >>> Processing user site-packages <<< 15980 1727204142.31361: stdout chunk (state=3): >>>Processing global site-packages <<< 15980 1727204142.31388: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 15980 1727204142.31406: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 15980 1727204142.31433: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 15980 1727204142.31496: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py<<< 15980 1727204142.31500: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 15980 1727204142.31621: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd2d190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 15980 1727204142.31635: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204142.31662: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7f16ffd2e090><<< 15980 1727204142.31673: stdout chunk (state=3): >>> <<< 15980 1727204142.31711: stdout chunk (state=3): >>>import 'site' # <<< 15980 1727204142.31720: stdout chunk (state=3): >>> <<< 15980 1727204142.31789: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux <<< 15980 1727204142.31837: stdout chunk (state=3): >>>Type "help", "copyright", "credits" or "license" for more information. <<< 15980 1727204142.32200: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15980 1727204142.32219: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 15980 1727204142.32265: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc'<<< 15980 1727204142.32323: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 15980 1727204142.32362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc'<<< 15980 1727204142.32411: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15980 1727204142.32447: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 15980 1727204142.32514: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd6bf80> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py<<< 15980 1727204142.32539: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc'<<< 15980 1727204142.32599: stdout chunk 
(state=3): >>> import '_operator' # <<< 15980 1727204142.32602: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd80110><<< 15980 1727204142.32605: stdout chunk (state=3): >>> <<< 15980 1727204142.32680: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 15980 1727204142.32684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 15980 1727204142.32718: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py<<< 15980 1727204142.32803: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204142.32858: stdout chunk (state=3): >>>import 'itertools' # <<< 15980 1727204142.32903: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc'<<< 15980 1727204142.32915: stdout chunk (state=3): >>> import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffda3950> <<< 15980 1727204142.32952: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 15980 1727204142.33001: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffda3fe0><<< 15980 1727204142.33021: stdout chunk (state=3): >>> import '_collections' # <<< 15980 1727204142.33095: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd83c20> <<< 15980 1727204142.33163: stdout chunk (state=3): >>>import 
'_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd81370> <<< 15980 1727204142.33334: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd69130> <<< 15980 1727204142.33382: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py<<< 15980 1727204142.33417: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 15980 1727204142.33445: stdout chunk (state=3): >>>import '_sre' # <<< 15980 1727204142.33470: stdout chunk (state=3): >>> <<< 15980 1727204142.33488: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py<<< 15980 1727204142.33524: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc'<<< 15980 1727204142.33558: stdout chunk (state=3): >>> # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py<<< 15980 1727204142.33583: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc'<<< 15980 1727204142.33631: stdout chunk (state=3): >>> import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdc78c0> <<< 15980 1727204142.33661: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdc64e0> <<< 15980 1727204142.33719: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 15980 1727204142.33739: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd82210> import 
're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdc4d70><<< 15980 1727204142.33827: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 15980 1727204142.33866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdf4980> <<< 15980 1727204142.33883: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd683b0> <<< 15980 1727204142.33910: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py<<< 15980 1727204142.33928: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc'<<< 15980 1727204142.33973: stdout chunk (state=3): >>> # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.34004: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.34016: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffdf4e30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdf4ce0><<< 15980 1727204142.34063: stdout chunk (state=3): >>> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.34101: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.34116: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffdf50a0> <<< 15980 
1727204142.34166: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd66ed0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py<<< 15980 1727204142.34187: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204142.34249: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 15980 1727204142.34283: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdf5760><<< 15980 1727204142.34312: stdout chunk (state=3): >>> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdf5430><<< 15980 1727204142.34325: stdout chunk (state=3): >>> import 'importlib.machinery' # <<< 15980 1727204142.34355: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc'<<< 15980 1727204142.34388: stdout chunk (state=3): >>> import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdf6660> <<< 15980 1727204142.34419: stdout chunk (state=3): >>>import 'importlib.util' # <<< 15980 1727204142.34441: stdout chunk (state=3): >>> import 'runpy' # <<< 15980 1727204142.34518: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc'<<< 15980 1727204142.34540: stdout chunk (state=3): >>> <<< 15980 1727204142.34571: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches 
/usr/lib64/python3.12/fnmatch.py<<< 15980 1727204142.34597: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffe10890><<< 15980 1727204142.34608: stdout chunk (state=3): >>> import 'errno' # <<< 15980 1727204142.34658: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 15980 1727204142.34673: stdout chunk (state=3): >>> # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.34708: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffe11fa0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py<<< 15980 1727204142.34729: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 15980 1727204142.34777: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc'<<< 15980 1727204142.34789: stdout chunk (state=3): >>> <<< 15980 1727204142.34847: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffe12e40> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.34850: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so'<<< 15980 1727204142.34890: stdout chunk (state=3): >>> import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffe134a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffe12390> <<< 
15980 1727204142.34908: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py<<< 15980 1727204142.34934: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15980 1727204142.34996: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 15980 1727204142.35012: stdout chunk (state=3): >>> # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 15980 1727204142.35031: stdout chunk (state=3): >>> import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffe13e60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffe13590> <<< 15980 1727204142.35125: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdf66c0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py<<< 15980 1727204142.35143: stdout chunk (state=3): >>> <<< 15980 1727204142.35163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<< 15980 1727204142.35201: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 15980 1727204142.35233: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 15980 1727204142.35475: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffbd7d10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc 
matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffc04890> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc045f0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffc048c0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffc04aa0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffbd5eb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 15980 1727204142.35588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15980 1727204142.35613: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 15980 1727204142.35650: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 15980 1727204142.35676: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc060f0> <<< 15980 1727204142.35713: stdout chunk 
(state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc04d70> <<< 15980 1727204142.35758: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdf6db0><<< 15980 1727204142.35779: stdout chunk (state=3): >>> <<< 15980 1727204142.35800: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15980 1727204142.35891: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc'<<< 15980 1727204142.35911: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py<<< 15980 1727204142.35939: stdout chunk (state=3): >>> <<< 15980 1727204142.35997: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15980 1727204142.36053: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc2a4b0> <<< 15980 1727204142.36152: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 15980 1727204142.36183: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204142.36196: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 15980 1727204142.36513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc465d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # <<< 15980 1727204142.36516: stdout chunk (state=3): >>> <<< 15980 1727204142.36539: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py<<< 15980 1727204142.36557: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc'<<< 15980 1727204142.36598: stdout chunk (state=3): >>> import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc7f350> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py<<< 15980 1727204142.36618: stdout chunk (state=3): >>> <<< 15980 1727204142.36656: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc'<<< 15980 1727204142.36677: stdout chunk (state=3): >>> <<< 15980 1727204142.36763: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15980 1727204142.36911: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffca5af0> <<< 15980 1727204142.37029: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc7f470> <<< 15980 1727204142.37099: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc47260> <<< 15980 1727204142.37139: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 15980 1727204142.37179: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f16ffa80440><<< 15980 1727204142.37197: stdout chunk (state=3): >>> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc45610> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc07050><<< 15980 1727204142.37253: stdout chunk (state=3): >>> <<< 15980 1727204142.37364: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15980 1727204142.37403: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f16ffc453a0> <<< 15980 1727204142.37529: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_dbn5b074/ansible_stat_payload.zip'<<< 15980 1727204142.37546: stdout chunk (state=3): >>> # zipimport: zlib available<<< 15980 1727204142.37657: stdout chunk (state=3): >>> <<< 15980 1727204142.37822: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.37875: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py<<< 15980 1727204142.37879: stdout chunk (state=3): >>> <<< 15980 1727204142.37892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 15980 1727204142.37955: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py<<< 15980 1727204142.37972: stdout chunk (state=3): >>> <<< 15980 1727204142.38075: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc'<<< 15980 1727204142.38084: stdout chunk (state=3): >>> <<< 15980 1727204142.38118: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py<<< 15980 1727204142.38142: stdout chunk (state=3): >>> # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 15980 1727204142.38174: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffada120> import '_typing' # <<< 15980 1727204142.38196: stdout chunk (state=3): >>> <<< 15980 1727204142.38497: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffab1010> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffab01a0><<< 15980 1727204142.38512: stdout chunk (state=3): >>> # zipimport: zlib available<<< 15980 1727204142.38548: stdout chunk (state=3): >>> import 'ansible' # <<< 15980 1727204142.38578: stdout chunk (state=3): >>> # zipimport: zlib available<<< 15980 1727204142.38596: stdout chunk (state=3): >>> # zipimport: zlib available<<< 15980 1727204142.38639: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15980 1727204142.38659: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 15980 1727204142.38685: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15980 1727204142.41111: stdout chunk (state=3): >>># zipimport: zlib available<<< 15980 1727204142.41276: stdout chunk (state=3): >>> <<< 15980 1727204142.43163: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc'<<< 15980 1727204142.43202: stdout chunk (state=3): >>> import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffab3fb0> <<< 15980 1727204142.43232: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 15980 1727204142.43272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204142.43276: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 15980 1727204142.43323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 15980 1727204142.43327: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py<<< 15980 1727204142.43347: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 15980 1727204142.43395: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.43399: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.43418: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffb05b50> <<< 15980 1727204142.43471: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffb058e0><<< 15980 1727204142.43486: stdout chunk (state=3): >>> <<< 15980 1727204142.43515: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffb051f0><<< 15980 1727204142.43525: stdout chunk (state=3): >>> <<< 15980 1727204142.43546: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py<<< 15980 1727204142.43577: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc'<<< 15980 1727204142.43647: stdout chunk (state=3): >>> import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffb05640> <<< 15980 1727204142.43652: stdout chunk (state=3): >>>import 'json' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f16ffadadb0> <<< 15980 1727204142.43678: stdout chunk (state=3): >>>import 'atexit' # <<< 15980 1727204142.43712: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.43728: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffb068d0><<< 15980 1727204142.43774: stdout chunk (state=3): >>> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so'<<< 15980 1727204142.43796: stdout chunk (state=3): >>> # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.43827: stdout chunk (state=3): >>>import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffb06b10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 15980 1727204142.43899: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc'<<< 15980 1727204142.43933: stdout chunk (state=3): >>> import '_locale' # <<< 15980 1727204142.44010: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffb06ff0><<< 15980 1727204142.44031: stdout chunk (state=3): >>> import 'pwd' # <<< 15980 1727204142.44091: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15980 1727204142.44158: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff968d40> <<< 15980 1727204142.44210: stdout chunk 
(state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 15980 1727204142.44214: stdout chunk (state=3): >>> # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 15980 1727204142.44224: stdout chunk (state=3): >>> import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff96a960><<< 15980 1727204142.44253: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 15980 1727204142.44313: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc'<<< 15980 1727204142.44345: stdout chunk (state=3): >>> import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff96b320> <<< 15980 1727204142.44380: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 15980 1727204142.44436: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 15980 1727204142.44452: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff96c500><<< 15980 1727204142.44496: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 15980 1727204142.44555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 15980 1727204142.44595: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py<<< 15980 1727204142.44606: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc'<<< 15980 1727204142.44697: stdout chunk (state=3): >>> import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f16ff96efc0> <<< 15980 1727204142.44781: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 15980 1727204142.44794: stdout chunk (state=3): >>> # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff96f0e0> <<< 15980 1727204142.44834: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff96d280> <<< 15980 1727204142.44873: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 15980 1727204142.44924: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc'<<< 15980 1727204142.44931: stdout chunk (state=3): >>> <<< 15980 1727204142.44949: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc'<<< 15980 1727204142.44994: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15980 1727204142.45036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 15980 1727204142.45084: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 15980 1727204142.45135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 15980 1727204142.45139: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f16ff972ea0> import '_tokenize' # <<< 15980 1727204142.45142: stdout chunk (state=3): >>> <<< 15980 1727204142.45235: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff971970> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff971700><<< 15980 1727204142.45264: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 15980 1727204142.45281: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc'<<< 15980 1727204142.45420: stdout chunk (state=3): >>> import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff973b00><<< 15980 1727204142.45431: stdout chunk (state=3): >>> <<< 15980 1727204142.45460: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff96d790> <<< 15980 1727204142.45516: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.45536: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff9bb020> <<< 15980 1727204142.45599: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py<<< 15980 1727204142.45602: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff9bb1a0><<< 15980 1727204142.45636: stdout chunk (state=3): >>> <<< 15980 1727204142.45639: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 15980 1727204142.45678: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 15980 1727204142.45710: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 15980 1727204142.45721: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc'<<< 15980 1727204142.45762: stdout chunk (state=3): >>> # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.45799: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff9bcd70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff9bcb30><<< 15980 1727204142.45832: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15980 1727204142.45987: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc'<<< 15980 1727204142.46059: stdout chunk (state=3): >>> # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.46100: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff9bf260> <<< 15980 1727204142.46103: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff9bd430><<< 15980 
1727204142.46136: stdout chunk (state=3): >>> <<< 15980 1727204142.46154: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py<<< 15980 1727204142.46213: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc'<<< 15980 1727204142.46247: stdout chunk (state=3): >>> <<< 15980 1727204142.46283: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 15980 1727204142.46304: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 15980 1727204142.46315: stdout chunk (state=3): >>> <<< 15980 1727204142.46381: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff9c6930><<< 15980 1727204142.46397: stdout chunk (state=3): >>> <<< 15980 1727204142.46661: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff9bf2c0> <<< 15980 1727204142.46721: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.46741: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff9c77a0> <<< 15980 1727204142.46770: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 15980 1727204142.46804: stdout chunk (state=3): >>> # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' 
import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff9c7980><<< 15980 1727204142.46881: stdout chunk (state=3): >>> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 15980 1727204142.46908: stdout chunk (state=3): >>> # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff9c7b30> <<< 15980 1727204142.46932: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff9bb410> <<< 15980 1727204142.46967: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py<<< 15980 1727204142.46995: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 15980 1727204142.47016: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 15980 1727204142.47042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 15980 1727204142.47089: stdout chunk (state=3): >>> # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.47145: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.47158: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff9cb3e0><<< 15980 1727204142.47259: stdout chunk (state=3): >>> <<< 15980 1727204142.47434: stdout chunk (state=3): >>># extension module 'array' 
loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 15980 1727204142.47478: stdout chunk (state=3): >>> # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.47501: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff9cc6b0> <<< 15980 1727204142.47511: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff9c9b50> <<< 15980 1727204142.47545: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 15980 1727204142.47572: stdout chunk (state=3): >>> # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 15980 1727204142.47601: stdout chunk (state=3): >>> import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff9caf00> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff9c9760> <<< 15980 1727204142.47625: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.47645: stdout chunk (state=3): >>># zipimport: zlib available<<< 15980 1727204142.47664: stdout chunk (state=3): >>> import 'ansible.module_utils.compat' # <<< 15980 1727204142.47707: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15980 1727204142.47856: stdout chunk (state=3): >>># zipimport: zlib available<<< 15980 1727204142.47958: stdout chunk (state=3): >>> <<< 15980 1727204142.48040: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.48074: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 15980 1727204142.48104: stdout chunk (state=3): >>># zipimport: zlib available<<< 15980 1727204142.48143: stdout chunk 
(state=3): >>> # zipimport: zlib available<<< 15980 1727204142.48173: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text' # <<< 15980 1727204142.48192: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.48458: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.48622: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.49777: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.50744: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 15980 1727204142.50749: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 15980 1727204142.50752: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text.converters' # <<< 15980 1727204142.50790: stdout chunk (state=3): >>> # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 15980 1727204142.50794: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204142.50887: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 15980 1727204142.50974: stdout chunk (state=3): >>> import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffa54860> <<< 15980 1727204142.51033: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 15980 1727204142.51070: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 15980 1727204142.51103: stdout chunk (state=3): >>>import 'ctypes._endian' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f16ffa55640> <<< 15980 1727204142.51116: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff9cfdd0> <<< 15980 1727204142.51158: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 15980 1727204142.51215: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15980 1727204142.51227: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.51274: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 15980 1727204142.51669: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.51834: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 15980 1727204142.51838: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 15980 1727204142.51879: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffa553a0> # zipimport: zlib available <<< 15980 1727204142.52723: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.53582: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.53706: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.53858: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 15980 1727204142.53879: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.53938: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.54208: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 15980 1727204142.54212: stdout chunk (state=3): >>> # zipimport: zlib available # zipimport: zlib available <<< 15980 1727204142.54307: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available 
<<< 15980 1727204142.54352: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available<<< 15980 1727204142.54376: stdout chunk (state=3): >>> <<< 15980 1727204142.54422: stdout chunk (state=3): >>># zipimport: zlib available<<< 15980 1727204142.54490: stdout chunk (state=3): >>> import 'ansible.module_utils.parsing.convert_bool' # <<< 15980 1727204142.54549: stdout chunk (state=3): >>># zipimport: zlib available<<< 15980 1727204142.54562: stdout chunk (state=3): >>> <<< 15980 1727204142.54946: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.55371: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py<<< 15980 1727204142.55481: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 15980 1727204142.55507: stdout chunk (state=3): >>>import '_ast' # <<< 15980 1727204142.55633: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffa56420><<< 15980 1727204142.55649: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15980 1727204142.55769: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.55903: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 15980 1727204142.55924: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 15980 1727204142.55942: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 15980 1727204142.55991: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 15980 1727204142.56002: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15980 1727204142.56157: stdout chunk (state=3): >>># extension 
module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.56308: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff862120><<< 15980 1727204142.56326: stdout chunk (state=3): >>> <<< 15980 1727204142.56385: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.56414: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff862a80> <<< 15980 1727204142.56463: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffa572c0> # zipimport: zlib available<<< 15980 1727204142.56468: stdout chunk (state=3): >>> <<< 15980 1727204142.56533: stdout chunk (state=3): >>># zipimport: zlib available<<< 15980 1727204142.56667: stdout chunk (state=3): >>> import 'ansible.module_utils.common.locale' # <<< 15980 1727204142.56703: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15980 1727204142.56781: stdout chunk (state=3): >>># zipimport: zlib available<<< 15980 1727204142.56806: stdout chunk (state=3): >>> <<< 15980 1727204142.56876: stdout chunk (state=3): >>># zipimport: zlib available<<< 15980 1727204142.56996: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15980 1727204142.57074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15980 
1727204142.57221: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 15980 1727204142.57235: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff861790> <<< 15980 1727204142.57310: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff862c60> <<< 15980 1727204142.57363: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 15980 1727204142.57398: stdout chunk (state=3): >>> import 'ansible.module_utils.common.process' # <<< 15980 1727204142.57413: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.57561: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.57621: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.57672: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.57751: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 15980 1727204142.57762: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15980 1727204142.57840: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 15980 1727204142.57862: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15980 1727204142.57950: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc'<<< 15980 1727204142.57979: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 15980 1727204142.58012: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15980 1727204142.58181: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff8f2e40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff86cbc0> <<< 15980 1727204142.58327: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff86ac60> <<< 15980 1727204142.58331: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff86aab0><<< 15980 1727204142.58345: stdout chunk (state=3): >>> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 15980 1727204142.58379: stdout chunk (state=3): >>># zipimport: zlib available <<< 15980 1727204142.58472: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # <<< 15980 1727204142.58475: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 15980 1727204142.58551: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 15980 1727204142.58582: stdout chunk (state=3): >>> # zipimport: zlib available<<< 15980 1727204142.58624: stdout chunk (state=3): >>> <<< 15980 1727204142.58642: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 15980 1727204142.58877: stdout chunk (state=3): >>># zipimport: zlib available<<< 15980 1727204142.59070: stdout chunk (state=3): >>> <<< 15980 1727204142.59220: stdout chunk (state=3): >>># zipimport: zlib available<<< 15980 1727204142.59231: stdout chunk 
(state=3): >>> <<< 15980 1727204142.59420: stdout chunk (state=3): >>> <<< 15980 1727204142.59434: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 15980 1727204142.59463: stdout chunk (state=3): >>># destroy __main__ <<< 15980 1727204142.59962: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks<<< 15980 1727204142.59996: stdout chunk (state=3): >>> # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc<<< 15980 1727204142.60055: stdout chunk (state=3): >>> # clear sys.last_type # clear sys.last_value # clear sys.last_traceback <<< 15980 1727204142.60096: stdout chunk (state=3): >>># clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin<<< 15980 1727204142.60112: stdout chunk (state=3): >>> # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread<<< 15980 1727204142.60155: stdout chunk (state=3): >>> # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal<<< 15980 1727204142.60182: stdout chunk (state=3): >>> # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath 
# cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins<<< 15980 1727204142.60215: stdout chunk (state=3): >>> # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections<<< 15980 1727204142.60252: stdout chunk (state=3): >>> # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64<<< 15980 1727204142.60275: stdout chunk (state=3): >>> # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma<<< 15980 1727204142.60394: stdout chunk (state=3): >>> # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # 
destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing 
systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cle<<< 15980 1727204142.60424: stdout chunk (state=3): >>>anup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # 
destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 15980 1727204142.60962: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy 
json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime <<< 15980 1727204142.60970: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 15980 1727204142.61056: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal <<< 15980 1727204142.61069: stdout chunk (state=3): >>># cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize<<< 15980 1727204142.61081: stdout chunk (state=3): >>> # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 15980 1727204142.61191: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy 
re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings <<< 15980 1727204142.61242: stdout chunk (state=3): >>># cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon <<< 15980 1727204142.61246: stdout chunk (state=3): >>># destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15980 1727204142.61399: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 15980 1727204142.61572: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # 
destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15980 1727204142.61675: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs <<< 15980 1727204142.61708: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 15980 1727204142.61733: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref <<< 15980 1727204142.61789: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _string # destroy re <<< 15980 1727204142.61795: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins<<< 15980 1727204142.61823: stdout chunk (state=3): >>> # destroy _thread # clear sys.audit hooks <<< 15980 1727204142.62374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204142.62402: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. 
<<< 15980 1727204142.62405: stdout chunk (state=3): >>><<< 15980 1727204142.62408: stderr chunk (state=3): >>><<< 15980 1727204142.62512: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16fff18530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffee7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16fff1aab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd2d190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd2e090> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd6bf80> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd80110> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffda3950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffda3fe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd83c20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd81370> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd69130> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdc78c0> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdc64e0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd82210> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdc4d70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdf4980> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd683b0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffdf4e30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdf4ce0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffdf50a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffd66ed0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdf5760> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdf5430> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdf6660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffe10890> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffe11fa0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f16ffe12e40> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffe134a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffe12390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffe13e60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffe13590> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdf66c0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffbd7d10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffc04890> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc045f0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffc048c0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffc04aa0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffbd5eb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc060f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc04d70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffdf6db0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc2a4b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc465d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc7f350> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffca5af0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc7f470> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc47260> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffa80440> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc45610> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffc07050> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f16ffc453a0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_dbn5b074/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffada120> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffab1010> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffab01a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffab3fb0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffb05b50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffb058e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffb051f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffb05640> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffadadb0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffb068d0> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffb06b10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffb06ff0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff968d40> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff96a960> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff96b320> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff96c500> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff96efc0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff96f0e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff96d280> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff972ea0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff971970> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff971700> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff973b00> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f16ff96d790> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff9bb020> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff9bb1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff9bcd70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff9bcb30> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff9bf260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff9bd430> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff9c6930> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff9bf2c0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff9c77a0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff9c7980> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff9c7b30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff9bb410> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff9cb3e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff9cc6b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff9c9b50> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff9caf00> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff9c9760> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ffa54860> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffa55640> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff9cfdd0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffa553a0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffa56420> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff862120> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff862a80> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ffa572c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f16ff861790> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff862c60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff8f2e40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff86cbc0> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f16ff86ac60> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f16ff86aab0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # 
cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing 
contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # 
cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy 
ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # 
destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # 
cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # 
destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] 
removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] 
removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes 
# destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy 
linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # 
destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 15980 1727204142.63300: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': 
'/root/.ansible/tmp/ansible-tmp-1727204141.954911-16485-203793281700435/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204142.63303: _low_level_execute_command(): starting 15980 1727204142.63306: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204141.954911-16485-203793281700435/ > /dev/null 2>&1 && sleep 0' 15980 1727204142.63848: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204142.63892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204142.63939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204142.63954: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204142.64030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204142.64081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204142.64085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204142.64201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 
<<< 15980 1727204142.66955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204142.66959: stdout chunk (state=3): >>><<< 15980 1727204142.66962: stderr chunk (state=3): >>><<< 15980 1727204142.66983: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15980 1727204142.66995: handler run complete 15980 1727204142.67022: attempt loop complete, returning result 15980 1727204142.67060: _execute() done 15980 1727204142.67064: dumping result to json 15980 1727204142.67068: done dumping result, returning 15980 1727204142.67071: done running TaskExecutor() for managed-node2/TASK: Check if system is ostree [127b8e07-fff9-5f1d-4b72-000000000091] 15980 1727204142.67073: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000091 15980 1727204142.67236: done sending task result for task 
127b8e07-fff9-5f1d-4b72-000000000091 15980 1727204142.67239: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 15980 1727204142.67309: no more pending results, returning what we have 15980 1727204142.67313: results queue empty 15980 1727204142.67314: checking for any_errors_fatal 15980 1727204142.67320: done checking for any_errors_fatal 15980 1727204142.67321: checking for max_fail_percentage 15980 1727204142.67323: done checking for max_fail_percentage 15980 1727204142.67324: checking to see if all hosts have failed and the running result is not ok 15980 1727204142.67325: done checking to see if all hosts have failed 15980 1727204142.67326: getting the remaining hosts for this loop 15980 1727204142.67328: done getting the remaining hosts for this loop 15980 1727204142.67332: getting the next task for host managed-node2 15980 1727204142.67338: done getting next task for host managed-node2 15980 1727204142.67341: ^ task is: TASK: Set flag to indicate system is ostree 15980 1727204142.67344: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204142.67347: getting variables 15980 1727204142.67349: in VariableManager get_vars() 15980 1727204142.67383: Calling all_inventory to load vars for managed-node2 15980 1727204142.67386: Calling groups_inventory to load vars for managed-node2 15980 1727204142.67389: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204142.67401: Calling all_plugins_play to load vars for managed-node2 15980 1727204142.67404: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204142.67407: Calling groups_plugins_play to load vars for managed-node2 15980 1727204142.67937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204142.68170: done with get_vars() 15980 1727204142.68185: done getting variables 15980 1727204142.68307: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:55:42 -0400 (0:00:00.803) 0:00:04.093 ***** 15980 1727204142.68351: entering _queue_task() for managed-node2/set_fact 15980 1727204142.68354: Creating lock for set_fact 15980 1727204142.68815: worker is 1 (out of 1 available) 15980 1727204142.68828: exiting _queue_task() for managed-node2/set_fact 15980 1727204142.68843: done queuing things up, now waiting for results queue to drain 15980 1727204142.68846: waiting for pending results... 
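The "Check if system is ostree" task that just completed boils down to a file-existence test on the managed node: the `stat` module probed `/run/ostree-booted` and reported `exists: false`. A hypothetical standalone equivalent of that check (run locally rather than through Ansible; the function name is illustrative, not from the playbook) would be:

```python
import os

# Equivalent of the stat-module check in the trace: rpm-ostree based
# systems (e.g. Fedora CoreOS / Silverblue) create this marker file
# at boot, so its presence identifies an ostree deployment.
def system_is_ostree(marker="/run/ostree-booted"):
    return os.path.exists(marker)

# On managed-node2 in this run the marker was absent, so the
# resulting fact __network_is_ostree ends up False.
print(system_is_ostree())
```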
15980 1727204142.69185: running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree 15980 1727204142.69229: in run() - task 127b8e07-fff9-5f1d-4b72-000000000092 15980 1727204142.69249: variable 'ansible_search_path' from source: unknown 15980 1727204142.69257: variable 'ansible_search_path' from source: unknown 15980 1727204142.69302: calling self._execute() 15980 1727204142.69400: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204142.69421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204142.69532: variable 'omit' from source: magic vars 15980 1727204142.69979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204142.70316: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204142.70372: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204142.70421: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204142.70458: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204142.70562: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204142.70594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204142.70639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204142.70674: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204142.70825: Evaluated conditional (not __network_is_ostree is defined): True 15980 1727204142.70849: variable 'omit' from source: magic vars 15980 1727204142.70904: variable 'omit' from source: magic vars 15980 1727204142.71067: variable '__ostree_booted_stat' from source: set_fact 15980 1727204142.71166: variable 'omit' from source: magic vars 15980 1727204142.71174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204142.71204: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204142.71232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204142.71256: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204142.71284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204142.71321: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204142.71385: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204142.71393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204142.71468: Set connection var ansible_connection to ssh 15980 1727204142.71483: Set connection var ansible_pipelining to False 15980 1727204142.71508: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204142.71519: Set connection var ansible_timeout to 10 15980 1727204142.71527: Set connection var ansible_shell_type to sh 15980 1727204142.71537: Set connection var ansible_shell_executable to /bin/sh 15980 1727204142.71571: variable 'ansible_shell_executable' 
from source: unknown 15980 1727204142.71580: variable 'ansible_connection' from source: unknown 15980 1727204142.71586: variable 'ansible_module_compression' from source: unknown 15980 1727204142.71602: variable 'ansible_shell_type' from source: unknown 15980 1727204142.71605: variable 'ansible_shell_executable' from source: unknown 15980 1727204142.71673: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204142.71675: variable 'ansible_pipelining' from source: unknown 15980 1727204142.71677: variable 'ansible_timeout' from source: unknown 15980 1727204142.71679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204142.71767: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204142.71786: variable 'omit' from source: magic vars 15980 1727204142.71797: starting attempt loop 15980 1727204142.71804: running the handler 15980 1727204142.71832: handler run complete 15980 1727204142.71847: attempt loop complete, returning result 15980 1727204142.71854: _execute() done 15980 1727204142.71860: dumping result to json 15980 1727204142.71869: done dumping result, returning 15980 1727204142.71880: done running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree [127b8e07-fff9-5f1d-4b72-000000000092] 15980 1727204142.71930: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000092 15980 1727204142.72008: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000092 15980 1727204142.72012: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 15980 1727204142.72096: no more pending results, returning what we have 15980 1727204142.72100: results 
queue empty 15980 1727204142.72101: checking for any_errors_fatal 15980 1727204142.72107: done checking for any_errors_fatal 15980 1727204142.72108: checking for max_fail_percentage 15980 1727204142.72110: done checking for max_fail_percentage 15980 1727204142.72111: checking to see if all hosts have failed and the running result is not ok 15980 1727204142.72112: done checking to see if all hosts have failed 15980 1727204142.72113: getting the remaining hosts for this loop 15980 1727204142.72115: done getting the remaining hosts for this loop 15980 1727204142.72119: getting the next task for host managed-node2 15980 1727204142.72129: done getting next task for host managed-node2 15980 1727204142.72132: ^ task is: TASK: Fix CentOS6 Base repo 15980 1727204142.72134: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204142.72141: getting variables 15980 1727204142.72143: in VariableManager get_vars() 15980 1727204142.72293: Calling all_inventory to load vars for managed-node2 15980 1727204142.72297: Calling groups_inventory to load vars for managed-node2 15980 1727204142.72301: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204142.72315: Calling all_plugins_play to load vars for managed-node2 15980 1727204142.72318: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204142.72330: Calling groups_plugins_play to load vars for managed-node2 15980 1727204142.72788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204142.72984: done with get_vars() 15980 1727204142.72996: done getting variables 15980 1727204142.73132: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:55:42 -0400 (0:00:00.048) 0:00:04.141 ***** 15980 1727204142.73170: entering _queue_task() for managed-node2/copy 15980 1727204142.73608: worker is 1 (out of 1 available) 15980 1727204142.73621: exiting _queue_task() for managed-node2/copy 15980 1727204142.73632: done queuing things up, now waiting for results queue to drain 15980 1727204142.73634: waiting for pending results... 
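Each `TASK [...]` banner in this trace embeds two durations: the previous task's elapsed time in parentheses and the cumulative playbook time after it (e.g. `(0:00:00.048) 0:00:04.141`). The helper below is an illustrative sketch for pulling those timings out of a flattened blob like this one — `task_timings` is a hypothetical name, and the sample uses a shortened task path rather than the full `/tmp/collections-MVC/...` path from the trace:

```python
import re

# Banner shape in the trace:
#   TASK [name] ***... task path: ... (0:00:00.048) 0:00:04.141 *****
# First duration: previous task's elapsed time; second: running total.
BANNER_RE = re.compile(
    r"TASK \[(?P<name>.+?)\] \*+.*?"
    r"\((?P<elapsed>\d+:\d+:\d+\.\d+)\)\s+(?P<total>\d+:\d+:\d+\.\d+)",
    re.DOTALL,
)

def task_timings(trace):
    """Return (task name, elapsed, cumulative) tuples in trace order."""
    return [(m["name"], m["elapsed"], m["total"])
            for m in BANNER_RE.finditer(trace)]

# Shortened sample; the real trace uses a long collection path here.
sample = ("TASK [Fix CentOS6 Base repo] ******** task path: "
          "el_repo_setup.yml:26 Tuesday 24 September 2024 14:55:42 "
          "-0400 (0:00:00.048) 0:00:04.141 *****")
print(task_timings(sample))  # [('Fix CentOS6 Base repo', '0:00:00.048', '0:00:04.141')]
```

The non-greedy `.*?` with `re.DOTALL` matters because the chunked trace puts the banner and its timestamp line in one run of text.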
15980 1727204142.73878: running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo 15980 1727204142.73926: in run() - task 127b8e07-fff9-5f1d-4b72-000000000094 15980 1727204142.73972: variable 'ansible_search_path' from source: unknown 15980 1727204142.73977: variable 'ansible_search_path' from source: unknown 15980 1727204142.74002: calling self._execute() 15980 1727204142.74101: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204142.74136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204142.74139: variable 'omit' from source: magic vars 15980 1727204142.74697: variable 'ansible_distribution' from source: facts 15980 1727204142.74739: Evaluated conditional (ansible_distribution == 'CentOS'): False 15980 1727204142.74743: when evaluation is False, skipping this task 15980 1727204142.74793: _execute() done 15980 1727204142.74796: dumping result to json 15980 1727204142.74799: done dumping result, returning 15980 1727204142.74802: done running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo [127b8e07-fff9-5f1d-4b72-000000000094] 15980 1727204142.74804: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000094 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 15980 1727204142.75090: no more pending results, returning what we have 15980 1727204142.75094: results queue empty 15980 1727204142.75095: checking for any_errors_fatal 15980 1727204142.75101: done checking for any_errors_fatal 15980 1727204142.75102: checking for max_fail_percentage 15980 1727204142.75103: done checking for max_fail_percentage 15980 1727204142.75104: checking to see if all hosts have failed and the running result is not ok 15980 1727204142.75105: done checking to see if all hosts have failed 15980 1727204142.75106: getting the remaining hosts for this loop 15980 1727204142.75108: done getting 
the remaining hosts for this loop 15980 1727204142.75114: getting the next task for host managed-node2 15980 1727204142.75123: done getting next task for host managed-node2 15980 1727204142.75125: ^ task is: TASK: Include the task 'enable_epel.yml' 15980 1727204142.75129: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204142.75133: getting variables 15980 1727204142.75134: in VariableManager get_vars() 15980 1727204142.75376: Calling all_inventory to load vars for managed-node2 15980 1727204142.75380: Calling groups_inventory to load vars for managed-node2 15980 1727204142.75384: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204142.75392: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000094 15980 1727204142.75395: WORKER PROCESS EXITING 15980 1727204142.75406: Calling all_plugins_play to load vars for managed-node2 15980 1727204142.75409: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204142.75412: Calling groups_plugins_play to load vars for managed-node2 15980 1727204142.75603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204142.75802: done with get_vars() 15980 1727204142.75822: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:55:42 -0400 (0:00:00.027) 0:00:04.169 ***** 15980 1727204142.75926: entering _queue_task() for managed-node2/include_tasks 15980 1727204142.76363: worker is 1 (out of 1 available) 15980 1727204142.76377: exiting _queue_task() for managed-node2/include_tasks 15980 1727204142.76388: done queuing things up, now waiting for results queue to drain 15980 1727204142.76390: waiting for pending results... 15980 1727204142.76560: running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' 15980 1727204142.76726: in run() - task 127b8e07-fff9-5f1d-4b72-000000000095 15980 1727204142.76729: variable 'ansible_search_path' from source: unknown 15980 1727204142.76732: variable 'ansible_search_path' from source: unknown 15980 1727204142.76794: calling self._execute() 15980 1727204142.76857: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204142.76871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204142.76886: variable 'omit' from source: magic vars 15980 1727204142.77592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204142.80019: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204142.80117: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204142.80171: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204142.80271: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204142.80275: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204142.80339: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204142.80383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204142.80430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204142.80487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204142.80517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204142.80773: variable '__network_is_ostree' from source: set_fact 15980 1727204142.80776: Evaluated conditional (not __network_is_ostree | d(false)): True 15980 1727204142.80779: _execute() done 15980 1727204142.80781: dumping result to json 15980 1727204142.80783: done dumping result, returning 15980 1727204142.80785: done running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' [127b8e07-fff9-5f1d-4b72-000000000095] 15980 1727204142.80787: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000095 15980 1727204142.80874: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000095 15980 1727204142.80879: WORKER PROCESS EXITING 15980 1727204142.80911: no more pending results, returning what we have 15980 1727204142.80917: in VariableManager get_vars() 15980 1727204142.80958: Calling all_inventory to load vars for managed-node2 15980 
1727204142.80961: Calling groups_inventory to load vars for managed-node2 15980 1727204142.80968: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204142.80982: Calling all_plugins_play to load vars for managed-node2 15980 1727204142.80985: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204142.80991: Calling groups_plugins_play to load vars for managed-node2 15980 1727204142.81459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204142.81808: done with get_vars() 15980 1727204142.81817: variable 'ansible_search_path' from source: unknown 15980 1727204142.81818: variable 'ansible_search_path' from source: unknown 15980 1727204142.81873: we have included files to process 15980 1727204142.81875: generating all_blocks data 15980 1727204142.81877: done generating all_blocks data 15980 1727204142.81882: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15980 1727204142.81884: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15980 1727204142.81887: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15980 1727204142.82748: done processing included file 15980 1727204142.82751: iterating over new_blocks loaded from include file 15980 1727204142.82752: in VariableManager get_vars() 15980 1727204142.82768: done with get_vars() 15980 1727204142.82770: filtering new block on tags 15980 1727204142.82797: done filtering new block on tags 15980 1727204142.82800: in VariableManager get_vars() 15980 1727204142.82819: done with get_vars() 15980 1727204142.82821: filtering new block on tags 15980 1727204142.82838: done filtering new block on tags 15980 1727204142.82841: done iterating over new_blocks loaded from include file included: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node2 15980 1727204142.82872: extending task lists for all hosts with included blocks 15980 1727204142.83000: done extending task lists 15980 1727204142.83002: done processing included files 15980 1727204142.83003: results queue empty 15980 1727204142.83004: checking for any_errors_fatal 15980 1727204142.83007: done checking for any_errors_fatal 15980 1727204142.83008: checking for max_fail_percentage 15980 1727204142.83009: done checking for max_fail_percentage 15980 1727204142.83010: checking to see if all hosts have failed and the running result is not ok 15980 1727204142.83011: done checking to see if all hosts have failed 15980 1727204142.83011: getting the remaining hosts for this loop 15980 1727204142.83013: done getting the remaining hosts for this loop 15980 1727204142.83015: getting the next task for host managed-node2 15980 1727204142.83020: done getting next task for host managed-node2 15980 1727204142.83022: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 15980 1727204142.83025: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204142.83028: getting variables 15980 1727204142.83029: in VariableManager get_vars() 15980 1727204142.83046: Calling all_inventory to load vars for managed-node2 15980 1727204142.83049: Calling groups_inventory to load vars for managed-node2 15980 1727204142.83052: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204142.83058: Calling all_plugins_play to load vars for managed-node2 15980 1727204142.83069: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204142.83073: Calling groups_plugins_play to load vars for managed-node2 15980 1727204142.83229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204142.83445: done with get_vars() 15980 1727204142.83455: done getting variables 15980 1727204142.83540: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 15980 1727204142.83782: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 40] ********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:55:42 -0400 (0:00:00.079) 0:00:04.248 ***** 15980 1727204142.83842: entering _queue_task() for managed-node2/command 15980 1727204142.83844: Creating lock for command 15980 1727204142.84223: worker is 1 (out of 1 available) 15980 1727204142.84474: exiting _queue_task() for managed-node2/command 15980 1727204142.84487: done queuing things up, now waiting for results queue to drain 15980 1727204142.84489: waiting for pending results... 
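The skip/run decisions recorded above all hinge on entries of the form `Evaluated conditional (...): True|False`. A companion sketch (same caveats: `evaluated_conditionals` is a hypothetical helper, not part of Ansible) that collects each conditional and its verdict from a trace blob:

```python
import re

# Trace entries of interest look like:
#   Evaluated conditional (ansible_distribution == 'CentOS'): False
COND_RE = re.compile(
    r"Evaluated conditional \((?P<cond>.+?)\): (?P<result>True|False)")

def evaluated_conditionals(trace):
    """Return (condition, bool result) pairs found in a trace blob."""
    return [(m["cond"], m["result"] == "True")
            for m in COND_RE.finditer(trace)]

sample = ("15980 1727204142.70825: Evaluated conditional "
          "(not __network_is_ostree is defined): True 15980 "
          "1727204142.74739: Evaluated conditional "
          "(ansible_distribution == 'CentOS'): False")
print(evaluated_conditionals(sample))
```

Because `(?P<cond>.+?)` is non-greedy but the pattern requires `): True` or `): False` to follow, conditions that themselves contain parentheses, such as `not __network_is_ostree | d(false)`, are still captured whole.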
15980 1727204142.84716: running TaskExecutor() for managed-node2/TASK: Create EPEL 40 15980 1727204142.84723: in run() - task 127b8e07-fff9-5f1d-4b72-0000000000af 15980 1727204142.84784: variable 'ansible_search_path' from source: unknown 15980 1727204142.84788: variable 'ansible_search_path' from source: unknown 15980 1727204142.84791: calling self._execute() 15980 1727204142.84896: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204142.84914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204142.84936: variable 'omit' from source: magic vars 15980 1727204142.85406: variable 'ansible_distribution' from source: facts 15980 1727204142.85426: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 15980 1727204142.85441: when evaluation is False, skipping this task 15980 1727204142.85455: _execute() done 15980 1727204142.85473: dumping result to json 15980 1727204142.85476: done dumping result, returning 15980 1727204142.85546: done running TaskExecutor() for managed-node2/TASK: Create EPEL 40 [127b8e07-fff9-5f1d-4b72-0000000000af] 15980 1727204142.85549: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000af 15980 1727204142.85781: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000af 15980 1727204142.85785: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 15980 1727204142.85852: no more pending results, returning what we have 15980 1727204142.85856: results queue empty 15980 1727204142.85857: checking for any_errors_fatal 15980 1727204142.85858: done checking for any_errors_fatal 15980 1727204142.85859: checking for max_fail_percentage 15980 1727204142.85861: done checking for max_fail_percentage 15980 1727204142.85862: checking to see if all hosts have failed and the running result is not ok 15980 
1727204142.85863: done checking to see if all hosts have failed 15980 1727204142.85864: getting the remaining hosts for this loop 15980 1727204142.85868: done getting the remaining hosts for this loop 15980 1727204142.85872: getting the next task for host managed-node2 15980 1727204142.85975: done getting next task for host managed-node2 15980 1727204142.85979: ^ task is: TASK: Install yum-utils package 15980 1727204142.85984: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204142.85992: getting variables 15980 1727204142.85994: in VariableManager get_vars() 15980 1727204142.86030: Calling all_inventory to load vars for managed-node2 15980 1727204142.86037: Calling groups_inventory to load vars for managed-node2 15980 1727204142.86041: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204142.86057: Calling all_plugins_play to load vars for managed-node2 15980 1727204142.86061: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204142.86123: Calling groups_plugins_play to load vars for managed-node2 15980 1727204142.86406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204142.86644: done with get_vars() 15980 1727204142.86657: done getting variables 15980 1727204142.86786: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:55:42 -0400 (0:00:00.029) 0:00:04.278 ***** 15980 1727204142.86818: entering _queue_task() for managed-node2/package 15980 1727204142.86820: Creating lock for package 15980 1727204142.87380: worker is 1 (out of 1 available) 15980 1727204142.87394: exiting _queue_task() for managed-node2/package 15980 1727204142.87412: done queuing things up, now waiting for results queue to drain 15980 1727204142.87415: waiting for pending results... 
15980 1727204142.87783: running TaskExecutor() for managed-node2/TASK: Install yum-utils package 15980 1727204142.87947: in run() - task 127b8e07-fff9-5f1d-4b72-0000000000b0 15980 1727204142.88055: variable 'ansible_search_path' from source: unknown 15980 1727204142.88059: variable 'ansible_search_path' from source: unknown 15980 1727204142.88062: calling self._execute() 15980 1727204142.88123: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204142.88136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204142.88151: variable 'omit' from source: magic vars 15980 1727204142.88878: variable 'ansible_distribution' from source: facts 15980 1727204142.88889: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 15980 1727204142.88897: when evaluation is False, skipping this task 15980 1727204142.88905: _execute() done 15980 1727204142.88911: dumping result to json 15980 1727204142.88918: done dumping result, returning 15980 1727204142.88935: done running TaskExecutor() for managed-node2/TASK: Install yum-utils package [127b8e07-fff9-5f1d-4b72-0000000000b0] 15980 1727204142.88947: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000b0 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 15980 1727204142.89228: no more pending results, returning what we have 15980 1727204142.89233: results queue empty 15980 1727204142.89234: checking for any_errors_fatal 15980 1727204142.89241: done checking for any_errors_fatal 15980 1727204142.89242: checking for max_fail_percentage 15980 1727204142.89243: done checking for max_fail_percentage 15980 1727204142.89244: checking to see if all hosts have failed and the running result is not ok 15980 1727204142.89246: done checking to see if all hosts have failed 15980 1727204142.89246: getting the remaining hosts for this loop 15980 
1727204142.89248: done getting the remaining hosts for this loop 15980 1727204142.89253: getting the next task for host managed-node2 15980 1727204142.89261: done getting next task for host managed-node2 15980 1727204142.89264: ^ task is: TASK: Enable EPEL 7 15980 1727204142.89272: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204142.89276: getting variables 15980 1727204142.89278: in VariableManager get_vars() 15980 1727204142.89313: Calling all_inventory to load vars for managed-node2 15980 1727204142.89317: Calling groups_inventory to load vars for managed-node2 15980 1727204142.89321: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204142.89337: Calling all_plugins_play to load vars for managed-node2 15980 1727204142.89341: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204142.89344: Calling groups_plugins_play to load vars for managed-node2 15980 1727204142.89763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204142.90062: done with get_vars() 15980 1727204142.90077: done getting variables 15980 1727204142.90120: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000b0 15980 1727204142.90124: WORKER PROCESS EXITING 15980 1727204142.90196: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:55:42 -0400 (0:00:00.034) 0:00:04.313 ***** 15980 1727204142.90286: entering _queue_task() for managed-node2/command 15980 1727204142.90764: worker is 1 (out of 1 available) 15980 1727204142.90780: exiting _queue_task() for managed-node2/command 15980 1727204142.90797: done queuing things up, now waiting for results queue to drain 15980 1727204142.90800: waiting for pending results... 
15980 1727204142.91081: running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 15980 1727204142.91188: in run() - task 127b8e07-fff9-5f1d-4b72-0000000000b1 15980 1727204142.91210: variable 'ansible_search_path' from source: unknown 15980 1727204142.91220: variable 'ansible_search_path' from source: unknown 15980 1727204142.91269: calling self._execute() 15980 1727204142.91387: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204142.91403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204142.91505: variable 'omit' from source: magic vars 15980 1727204142.91875: variable 'ansible_distribution' from source: facts 15980 1727204142.91898: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 15980 1727204142.91907: when evaluation is False, skipping this task 15980 1727204142.91914: _execute() done 15980 1727204142.91921: dumping result to json 15980 1727204142.91935: done dumping result, returning 15980 1727204142.91949: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 [127b8e07-fff9-5f1d-4b72-0000000000b1] 15980 1727204142.91959: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000b1 15980 1727204142.92306: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000b1 15980 1727204142.92309: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 15980 1727204142.92354: no more pending results, returning what we have 15980 1727204142.92357: results queue empty 15980 1727204142.92357: checking for any_errors_fatal 15980 1727204142.92365: done checking for any_errors_fatal 15980 1727204142.92367: checking for max_fail_percentage 15980 1727204142.92369: done checking for max_fail_percentage 15980 1727204142.92370: checking to see if all hosts have failed and the running result is not ok 15980 1727204142.92371: 
done checking to see if all hosts have failed 15980 1727204142.92371: getting the remaining hosts for this loop 15980 1727204142.92373: done getting the remaining hosts for this loop 15980 1727204142.92376: getting the next task for host managed-node2 15980 1727204142.92382: done getting next task for host managed-node2 15980 1727204142.92384: ^ task is: TASK: Enable EPEL 8 15980 1727204142.92388: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204142.92391: getting variables 15980 1727204142.92393: in VariableManager get_vars() 15980 1727204142.92420: Calling all_inventory to load vars for managed-node2 15980 1727204142.92422: Calling groups_inventory to load vars for managed-node2 15980 1727204142.92425: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204142.92438: Calling all_plugins_play to load vars for managed-node2 15980 1727204142.92441: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204142.92444: Calling groups_plugins_play to load vars for managed-node2 15980 1727204142.92624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204142.92909: done with get_vars() 15980 1727204142.92922: done getting variables 15980 1727204142.92992: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:55:42 -0400 (0:00:00.027) 0:00:04.340 ***** 15980 1727204142.93030: entering _queue_task() for managed-node2/command 15980 1727204142.93598: worker is 1 (out of 1 available) 15980 1727204142.93606: exiting _queue_task() for managed-node2/command 15980 1727204142.93617: done queuing things up, now waiting for results queue to drain 15980 1727204142.93619: waiting for pending results... 
15980 1727204142.93679: running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 15980 1727204142.93811: in run() - task 127b8e07-fff9-5f1d-4b72-0000000000b2 15980 1727204142.93834: variable 'ansible_search_path' from source: unknown 15980 1727204142.93844: variable 'ansible_search_path' from source: unknown 15980 1727204142.93891: calling self._execute() 15980 1727204142.93986: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204142.93999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204142.94013: variable 'omit' from source: magic vars 15980 1727204142.94448: variable 'ansible_distribution' from source: facts 15980 1727204142.94475: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 15980 1727204142.94485: when evaluation is False, skipping this task 15980 1727204142.94495: _execute() done 15980 1727204142.94505: dumping result to json 15980 1727204142.94514: done dumping result, returning 15980 1727204142.94524: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 [127b8e07-fff9-5f1d-4b72-0000000000b2] 15980 1727204142.94538: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000b2 15980 1727204142.94820: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000b2 15980 1727204142.94824: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 15980 1727204142.94880: no more pending results, returning what we have 15980 1727204142.94884: results queue empty 15980 1727204142.94885: checking for any_errors_fatal 15980 1727204142.94890: done checking for any_errors_fatal 15980 1727204142.94890: checking for max_fail_percentage 15980 1727204142.94892: done checking for max_fail_percentage 15980 1727204142.94893: checking to see if all hosts have failed and the running result is not ok 15980 1727204142.94894: 
done checking to see if all hosts have failed 15980 1727204142.94896: getting the remaining hosts for this loop 15980 1727204142.94898: done getting the remaining hosts for this loop 15980 1727204142.94902: getting the next task for host managed-node2 15980 1727204142.94912: done getting next task for host managed-node2 15980 1727204142.94915: ^ task is: TASK: Enable EPEL 6 15980 1727204142.94920: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204142.94924: getting variables 15980 1727204142.94926: in VariableManager get_vars() 15980 1727204142.94963: Calling all_inventory to load vars for managed-node2 15980 1727204142.94969: Calling groups_inventory to load vars for managed-node2 15980 1727204142.94974: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204142.94989: Calling all_plugins_play to load vars for managed-node2 15980 1727204142.94992: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204142.94996: Calling groups_plugins_play to load vars for managed-node2 15980 1727204142.95379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204142.95589: done with get_vars() 15980 1727204142.95601: done getting variables 15980 1727204142.95670: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:55:42 -0400 (0:00:00.026) 0:00:04.367 ***** 15980 1727204142.95705: entering _queue_task() for managed-node2/copy 15980 1727204142.96048: worker is 1 (out of 1 available) 15980 1727204142.96062: exiting _queue_task() for managed-node2/copy 15980 1727204142.96076: done queuing things up, now waiting for results queue to drain 15980 1727204142.96078: waiting for pending results... 
15980 1727204142.96491: running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 15980 1727204142.96496: in run() - task 127b8e07-fff9-5f1d-4b72-0000000000b4 15980 1727204142.96504: variable 'ansible_search_path' from source: unknown 15980 1727204142.96512: variable 'ansible_search_path' from source: unknown 15980 1727204142.96563: calling self._execute() 15980 1727204142.96660: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204142.96676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204142.96695: variable 'omit' from source: magic vars 15980 1727204142.97136: variable 'ansible_distribution' from source: facts 15980 1727204142.97158: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 15980 1727204142.97169: when evaluation is False, skipping this task 15980 1727204142.97177: _execute() done 15980 1727204142.97184: dumping result to json 15980 1727204142.97193: done dumping result, returning 15980 1727204142.97204: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 [127b8e07-fff9-5f1d-4b72-0000000000b4] 15980 1727204142.97214: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000b4 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 15980 1727204142.97396: no more pending results, returning what we have 15980 1727204142.97401: results queue empty 15980 1727204142.97402: checking for any_errors_fatal 15980 1727204142.97410: done checking for any_errors_fatal 15980 1727204142.97411: checking for max_fail_percentage 15980 1727204142.97413: done checking for max_fail_percentage 15980 1727204142.97414: checking to see if all hosts have failed and the running result is not ok 15980 1727204142.97415: done checking to see if all hosts have failed 15980 1727204142.97416: getting the remaining hosts for this loop 15980 1727204142.97418: done 
getting the remaining hosts for this loop 15980 1727204142.97422: getting the next task for host managed-node2 15980 1727204142.97436: done getting next task for host managed-node2 15980 1727204142.97440: ^ task is: TASK: Set network provider to 'nm' 15980 1727204142.97443: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204142.97447: getting variables 15980 1727204142.97449: in VariableManager get_vars() 15980 1727204142.97485: Calling all_inventory to load vars for managed-node2 15980 1727204142.97489: Calling groups_inventory to load vars for managed-node2 15980 1727204142.97493: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204142.97509: Calling all_plugins_play to load vars for managed-node2 15980 1727204142.97513: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204142.97516: Calling groups_plugins_play to load vars for managed-node2 15980 1727204142.97992: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000b4 15980 1727204142.97996: WORKER PROCESS EXITING 15980 1727204142.98023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204142.98234: done with get_vars() 15980 1727204142.98248: done getting variables 15980 1727204142.98316: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:13 Tuesday 24 September 2024 14:55:42 -0400 (0:00:00.026) 0:00:04.393 ***** 15980 1727204142.98350: entering _queue_task() for managed-node2/set_fact 15980 1727204142.98679: worker is 1 (out of 1 available) 15980 1727204142.98692: exiting _queue_task() for managed-node2/set_fact 15980 1727204142.98706: done queuing things up, now waiting for results queue to drain 15980 1727204142.98708: waiting for pending results... 15980 1727204142.98993: running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' 15980 1727204142.99098: in run() - task 127b8e07-fff9-5f1d-4b72-000000000007 15980 1727204142.99173: variable 'ansible_search_path' from source: unknown 15980 1727204142.99177: calling self._execute() 15980 1727204142.99281: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204142.99295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204142.99371: variable 'omit' from source: magic vars 15980 1727204142.99443: variable 'omit' from source: magic vars 15980 1727204142.99490: variable 'omit' from source: magic vars 15980 1727204142.99548: variable 'omit' from source: magic vars 15980 1727204142.99603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204142.99659: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204142.99695: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204142.99720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204142.99747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204142.99785: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204142.99846: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204142.99849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204142.99930: Set connection var ansible_connection to ssh 15980 1727204142.99949: Set connection var ansible_pipelining to False 15980 1727204142.99964: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204143.00172: Set connection var ansible_timeout to 10 15980 1727204143.00175: Set connection var ansible_shell_type to sh 15980 1727204143.00178: Set connection var ansible_shell_executable to /bin/sh 15980 1727204143.00180: variable 'ansible_shell_executable' from source: unknown 15980 1727204143.00182: variable 'ansible_connection' from source: unknown 15980 1727204143.00184: variable 'ansible_module_compression' from source: unknown 15980 1727204143.00186: variable 'ansible_shell_type' from source: unknown 15980 1727204143.00188: variable 'ansible_shell_executable' from source: unknown 15980 1727204143.00190: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204143.00192: variable 'ansible_pipelining' from source: unknown 15980 1727204143.00194: variable 'ansible_timeout' from source: unknown 15980 1727204143.00196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204143.00250: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204143.00273: variable 'omit' from source: magic vars 15980 1727204143.00285: starting attempt loop 15980 1727204143.00293: running the handler 15980 1727204143.00315: handler run complete 15980 1727204143.00334: attempt loop 
complete, returning result 15980 1727204143.00342: _execute() done 15980 1727204143.00349: dumping result to json 15980 1727204143.00358: done dumping result, returning 15980 1727204143.00374: done running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' [127b8e07-fff9-5f1d-4b72-000000000007] 15980 1727204143.00385: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000007 ok: [managed-node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 15980 1727204143.00591: no more pending results, returning what we have 15980 1727204143.00600: results queue empty 15980 1727204143.00602: checking for any_errors_fatal 15980 1727204143.00608: done checking for any_errors_fatal 15980 1727204143.00609: checking for max_fail_percentage 15980 1727204143.00611: done checking for max_fail_percentage 15980 1727204143.00612: checking to see if all hosts have failed and the running result is not ok 15980 1727204143.00613: done checking to see if all hosts have failed 15980 1727204143.00614: getting the remaining hosts for this loop 15980 1727204143.00616: done getting the remaining hosts for this loop 15980 1727204143.00621: getting the next task for host managed-node2 15980 1727204143.00631: done getting next task for host managed-node2 15980 1727204143.00634: ^ task is: TASK: meta (flush_handlers) 15980 1727204143.00636: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204143.00641: getting variables 15980 1727204143.00643: in VariableManager get_vars() 15980 1727204143.00678: Calling all_inventory to load vars for managed-node2 15980 1727204143.00681: Calling groups_inventory to load vars for managed-node2 15980 1727204143.00685: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204143.00699: Calling all_plugins_play to load vars for managed-node2 15980 1727204143.00702: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204143.00705: Calling groups_plugins_play to load vars for managed-node2 15980 1727204143.01191: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000007 15980 1727204143.01195: WORKER PROCESS EXITING 15980 1727204143.01223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204143.01422: done with get_vars() 15980 1727204143.01435: done getting variables 15980 1727204143.01507: in VariableManager get_vars() 15980 1727204143.01517: Calling all_inventory to load vars for managed-node2 15980 1727204143.01519: Calling groups_inventory to load vars for managed-node2 15980 1727204143.01522: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204143.01527: Calling all_plugins_play to load vars for managed-node2 15980 1727204143.01531: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204143.01534: Calling groups_plugins_play to load vars for managed-node2 15980 1727204143.01678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204143.01862: done with get_vars() 15980 1727204143.01879: done queuing things up, now waiting for results queue to drain 15980 1727204143.01881: results queue empty 15980 1727204143.01881: checking for any_errors_fatal 15980 1727204143.01884: done checking for any_errors_fatal 15980 1727204143.01885: checking for max_fail_percentage 15980 
1727204143.01886: done checking for max_fail_percentage 15980 1727204143.01886: checking to see if all hosts have failed and the running result is not ok 15980 1727204143.01887: done checking to see if all hosts have failed 15980 1727204143.01888: getting the remaining hosts for this loop 15980 1727204143.01889: done getting the remaining hosts for this loop 15980 1727204143.01892: getting the next task for host managed-node2 15980 1727204143.01896: done getting next task for host managed-node2 15980 1727204143.01897: ^ task is: TASK: meta (flush_handlers) 15980 1727204143.01899: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204143.01907: getting variables 15980 1727204143.01908: in VariableManager get_vars() 15980 1727204143.01916: Calling all_inventory to load vars for managed-node2 15980 1727204143.01918: Calling groups_inventory to load vars for managed-node2 15980 1727204143.01920: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204143.01925: Calling all_plugins_play to load vars for managed-node2 15980 1727204143.01930: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204143.01933: Calling groups_plugins_play to load vars for managed-node2 15980 1727204143.02076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204143.02460: done with get_vars() 15980 1727204143.02471: done getting variables 15980 1727204143.02520: in VariableManager get_vars() 15980 1727204143.02531: Calling all_inventory to load vars for managed-node2 15980 1727204143.02534: Calling groups_inventory to load vars for managed-node2 15980 1727204143.02536: Calling all_plugins_inventory to load vars for managed-node2 15980 
1727204143.02540: Calling all_plugins_play to load vars for managed-node2 15980 1727204143.02542: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204143.02545: Calling groups_plugins_play to load vars for managed-node2 15980 1727204143.02684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204143.02869: done with get_vars() 15980 1727204143.02881: done queuing things up, now waiting for results queue to drain 15980 1727204143.02883: results queue empty 15980 1727204143.02884: checking for any_errors_fatal 15980 1727204143.02885: done checking for any_errors_fatal 15980 1727204143.02886: checking for max_fail_percentage 15980 1727204143.02887: done checking for max_fail_percentage 15980 1727204143.02888: checking to see if all hosts have failed and the running result is not ok 15980 1727204143.02889: done checking to see if all hosts have failed 15980 1727204143.02889: getting the remaining hosts for this loop 15980 1727204143.02890: done getting the remaining hosts for this loop 15980 1727204143.02893: getting the next task for host managed-node2 15980 1727204143.02897: done getting next task for host managed-node2 15980 1727204143.02897: ^ task is: None 15980 1727204143.02899: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204143.02900: done queuing things up, now waiting for results queue to drain 15980 1727204143.02901: results queue empty 15980 1727204143.02902: checking for any_errors_fatal 15980 1727204143.02903: done checking for any_errors_fatal 15980 1727204143.02903: checking for max_fail_percentage 15980 1727204143.02904: done checking for max_fail_percentage 15980 1727204143.02905: checking to see if all hosts have failed and the running result is not ok 15980 1727204143.02906: done checking to see if all hosts have failed 15980 1727204143.02908: getting the next task for host managed-node2 15980 1727204143.02910: done getting next task for host managed-node2 15980 1727204143.02911: ^ task is: None 15980 1727204143.02912: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204143.02969: in VariableManager get_vars() 15980 1727204143.02984: done with get_vars() 15980 1727204143.02990: in VariableManager get_vars() 15980 1727204143.03000: done with get_vars() 15980 1727204143.03004: variable 'omit' from source: magic vars 15980 1727204143.03041: in VariableManager get_vars() 15980 1727204143.03050: done with get_vars() 15980 1727204143.03073: variable 'omit' from source: magic vars PLAY [Test configuring bridges] ************************************************ 15980 1727204143.03276: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15980 1727204143.03302: getting the remaining hosts for this loop 15980 1727204143.03304: done getting the remaining hosts for this loop 15980 1727204143.03306: getting the next task for host managed-node2 15980 1727204143.03309: done getting next task for host managed-node2 15980 1727204143.03310: ^ task is: TASK: Gathering Facts 15980 1727204143.03312: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204143.03314: getting variables 15980 1727204143.03315: in VariableManager get_vars() 15980 1727204143.03322: Calling all_inventory to load vars for managed-node2 15980 1727204143.03324: Calling groups_inventory to load vars for managed-node2 15980 1727204143.03327: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204143.03335: Calling all_plugins_play to load vars for managed-node2 15980 1727204143.03348: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204143.03351: Calling groups_plugins_play to load vars for managed-node2 15980 1727204143.03497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204143.03726: done with get_vars() 15980 1727204143.03736: done getting variables 15980 1727204143.03780: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3 Tuesday 24 September 2024 14:55:43 -0400 (0:00:00.054) 0:00:04.448 ***** 15980 1727204143.03805: entering _queue_task() for managed-node2/gather_facts 15980 1727204143.04125: worker is 1 (out of 1 available) 15980 1727204143.04140: exiting _queue_task() for managed-node2/gather_facts 15980 1727204143.04153: done queuing things up, now waiting for results queue to drain 15980 1727204143.04154: waiting for pending results... 
15980 1727204143.04425: running TaskExecutor() for managed-node2/TASK: Gathering Facts 15980 1727204143.04572: in run() - task 127b8e07-fff9-5f1d-4b72-0000000000da 15980 1727204143.04576: variable 'ansible_search_path' from source: unknown 15980 1727204143.04602: calling self._execute() 15980 1727204143.04771: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204143.04775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204143.04778: variable 'omit' from source: magic vars 15980 1727204143.05154: variable 'ansible_distribution_major_version' from source: facts 15980 1727204143.05177: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204143.05196: variable 'omit' from source: magic vars 15980 1727204143.05235: variable 'omit' from source: magic vars 15980 1727204143.05283: variable 'omit' from source: magic vars 15980 1727204143.05337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204143.05385: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204143.05417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204143.05446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204143.05519: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204143.05523: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204143.05526: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204143.05531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204143.05639: Set connection var ansible_connection to ssh 15980 1727204143.05654: Set 
connection var ansible_pipelining to False 15980 1727204143.05667: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204143.05680: Set connection var ansible_timeout to 10 15980 1727204143.05690: Set connection var ansible_shell_type to sh 15980 1727204143.05701: Set connection var ansible_shell_executable to /bin/sh 15980 1727204143.05770: variable 'ansible_shell_executable' from source: unknown 15980 1727204143.05774: variable 'ansible_connection' from source: unknown 15980 1727204143.05776: variable 'ansible_module_compression' from source: unknown 15980 1727204143.05779: variable 'ansible_shell_type' from source: unknown 15980 1727204143.05782: variable 'ansible_shell_executable' from source: unknown 15980 1727204143.05784: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204143.05786: variable 'ansible_pipelining' from source: unknown 15980 1727204143.05789: variable 'ansible_timeout' from source: unknown 15980 1727204143.05791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204143.06008: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204143.06063: variable 'omit' from source: magic vars 15980 1727204143.06069: starting attempt loop 15980 1727204143.06072: running the handler 15980 1727204143.06074: variable 'ansible_facts' from source: unknown 15980 1727204143.06098: _low_level_execute_command(): starting 15980 1727204143.06112: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204143.06999: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204143.07050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204143.07075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204143.07116: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204143.07227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15980 1727204143.09642: stdout chunk (state=3): >>>/root <<< 15980 1727204143.09904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204143.09909: stdout chunk (state=3): >>><<< 15980 1727204143.09911: stderr chunk (state=3): >>><<< 15980 1727204143.09935: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15980 1727204143.09956: _low_level_execute_command(): starting 15980 1727204143.09976: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204143.099419-16586-16514555322794 `" && echo ansible-tmp-1727204143.099419-16586-16514555322794="` echo /root/.ansible/tmp/ansible-tmp-1727204143.099419-16586-16514555322794 `" ) && sleep 0' 15980 1727204143.10793: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204143.10842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204143.10943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15980 1727204143.13814: stdout chunk (state=3): >>>ansible-tmp-1727204143.099419-16586-16514555322794=/root/.ansible/tmp/ansible-tmp-1727204143.099419-16586-16514555322794 <<< 15980 1727204143.14093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204143.14098: stdout chunk (state=3): >>><<< 15980 1727204143.14100: stderr chunk (state=3): >>><<< 15980 1727204143.14273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204143.099419-16586-16514555322794=/root/.ansible/tmp/ansible-tmp-1727204143.099419-16586-16514555322794 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15980 1727204143.14279: variable 'ansible_module_compression' from source: unknown 15980 1727204143.14282: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15980 1727204143.14311: variable 'ansible_facts' from source: unknown 15980 1727204143.14542: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204143.099419-16586-16514555322794/AnsiballZ_setup.py 15980 1727204143.14747: Sending initial data 15980 1727204143.14756: Sent initial data (152 bytes) 15980 1727204143.15439: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204143.15483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 15980 1727204143.15507: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204143.15611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 
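[editor's note] The `ANSIBALLZ: using cached module` entry above shows the controller reusing a pre-built, ZIP_DEFLATED-compressed payload for `ansible.modules.setup` from its local `ansiballz_cache` directory instead of rebuilding it for every task. A minimal sketch of that caching idea follows; the function names (`build_payload`, `cached_payload`) and the cache layout are hypothetical illustrations, not Ansible's actual internals:

```python
import io
import os
import zipfile


def build_payload(module_source: str) -> bytes:
    """Zip a module's source the way an AnsiballZ-style wrapper might."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("__main__.py", module_source)
    return buf.getvalue()


def cached_payload(cache_dir: str, key: str, module_source: str) -> bytes:
    """Return a cached payload if present, else build it once and store it.

    `key` plays the role of the cache-file name seen in the log,
    e.g. 'ansible.modules.setup-ZIP_DEFLATED'.
    """
    os.makedirs(cache_dir, exist_ok=True)
    path = os.path.join(cache_dir, key)
    if os.path.exists(path):
        with open(path, "rb") as f:
            return f.read()
    payload = build_payload(module_source)
    with open(path, "wb") as f:
        f.write(payload)
    return payload
```

With a warm cache, subsequent tasks using the same module skip the build step entirely, which is why the log line reports a cache hit rather than a payload build.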
15980 1727204143.15640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204143.15745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15980 1727204143.18063: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204143.18129: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204143.18210: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpgru_no75 /root/.ansible/tmp/ansible-tmp-1727204143.099419-16586-16514555322794/AnsiballZ_setup.py <<< 15980 1727204143.18214: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204143.099419-16586-16514555322794/AnsiballZ_setup.py" <<< 15980 1727204143.18321: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpgru_no75" to remote "/root/.ansible/tmp/ansible-tmp-1727204143.099419-16586-16514555322794/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204143.099419-16586-16514555322794/AnsiballZ_setup.py" <<< 15980 1727204143.20128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204143.20272: stderr chunk (state=3): >>><<< 15980 1727204143.20276: stdout chunk (state=3): >>><<< 15980 1727204143.20278: done transferring module to remote 15980 1727204143.20281: _low_level_execute_command(): starting 15980 1727204143.20284: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204143.099419-16586-16514555322794/ /root/.ansible/tmp/ansible-tmp-1727204143.099419-16586-16514555322794/AnsiballZ_setup.py && sleep 0' 15980 1727204143.20977: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204143.20995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204143.21067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204143.21125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204143.21151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204143.21189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204143.21307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15980 1727204143.24134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204143.24139: stdout chunk (state=3): >>><<< 15980 1727204143.24141: stderr chunk (state=3): >>><<< 15980 1727204143.24144: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15980 1727204143.24151: _low_level_execute_command(): starting 15980 1727204143.24154: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204143.099419-16586-16514555322794/AnsiballZ_setup.py && sleep 0' 15980 1727204143.25343: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204143.25508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204143.25512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204143.25527: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204143.25636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204143.25701: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 
1727204143.25746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204143.25925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15980 1727204144.09670: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "43", "epoch": 
"1727204143", "epoch_int": "1727204143", "date": "2024-09-24", "time": "14:55:43", "iso8601_micro": "2024-09-24T18:55:43.731162Z", "iso8601": "2024-09-24T18:55:43Z", "iso8601_basic": "20240924T145543731162", "iso8601_basic_short": "20240924T145543", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": [<<< 15980 1727204144.09860: stdout chunk (state=3): >>>"tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_loadavg": {"1m": 0.70458984375, "5m": 0.49462890625, "15m": 0.24169921875}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) 
CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3034, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 682, "free": 3034}, "nocache": {"free": 3463, "used": 253}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, 
"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 490, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325689856, "block_size": 4096, "block_total": 64479564, "block_available": 61358811, "block_used": 3120753, "inode_total": 16384000, "inode_available": 16301508, "inode_used": 82492, "uui<<< 15980 1727204144.09884: stdout chunk (state=3): >>>d": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", 
"alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15980 1727204144.12491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204144.12573: stderr chunk (state=3): >>><<< 15980 1727204144.12577: stdout chunk (state=3): >>><<< 15980 1727204144.12582: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", 
"ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], 
"ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "43", "epoch": "1727204143", "epoch_int": "1727204143", "date": "2024-09-24", "time": "14:55:43", "iso8601_micro": "2024-09-24T18:55:43.731162Z", "iso8601": "2024-09-24T18:55:43Z", "iso8601_basic": "20240924T145543731162", "iso8601_basic_short": "20240924T145543", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_loadavg": {"1m": 0.70458984375, "5m": 0.49462890625, "15m": 0.24169921875}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": 
"unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3034, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 682, "free": 3034}, "nocache": {"free": 3463, "used": 253}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, 
"holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 490, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325689856, "block_size": 4096, "block_total": 64479564, "block_available": 61358811, "block_used": 3120753, "inode_total": 16384000, "inode_available": 16301508, "inode_used": 82492, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", 
"netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
15980 1727204144.13081: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204143.099419-16586-16514555322794/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204144.13120: _low_level_execute_command(): starting 15980 1727204144.13212: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204143.099419-16586-16514555322794/ > /dev/null 2>&1 && sleep 0' 15980 1727204144.13917: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 15980 1727204144.14025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204144.14048: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204144.14091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204144.14181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15980 1727204144.17132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204144.17183: stderr chunk (state=3): >>><<< 15980 1727204144.17187: stdout chunk (state=3): >>><<< 15980 1727204144.17231: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15980 1727204144.17234: handler run complete 15980 1727204144.17481: variable 'ansible_facts' from source: 
unknown 15980 1727204144.17545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204144.17912: variable 'ansible_facts' from source: unknown 15980 1727204144.18016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204144.18164: attempt loop complete, returning result 15980 1727204144.18177: _execute() done 15980 1727204144.18184: dumping result to json 15980 1727204144.18217: done dumping result, returning 15980 1727204144.18234: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-5f1d-4b72-0000000000da] 15980 1727204144.18243: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000da ok: [managed-node2] 15980 1727204144.19219: no more pending results, returning what we have 15980 1727204144.19223: results queue empty 15980 1727204144.19224: checking for any_errors_fatal 15980 1727204144.19225: done checking for any_errors_fatal 15980 1727204144.19228: checking for max_fail_percentage 15980 1727204144.19229: done checking for max_fail_percentage 15980 1727204144.19230: checking to see if all hosts have failed and the running result is not ok 15980 1727204144.19231: done checking to see if all hosts have failed 15980 1727204144.19232: getting the remaining hosts for this loop 15980 1727204144.19233: done getting the remaining hosts for this loop 15980 1727204144.19237: getting the next task for host managed-node2 15980 1727204144.19242: done getting next task for host managed-node2 15980 1727204144.19244: ^ task is: TASK: meta (flush_handlers) 15980 1727204144.19246: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204144.19249: getting variables 15980 1727204144.19251: in VariableManager get_vars() 15980 1727204144.19304: Calling all_inventory to load vars for managed-node2 15980 1727204144.19307: Calling groups_inventory to load vars for managed-node2 15980 1727204144.19311: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204144.19320: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000da 15980 1727204144.19323: WORKER PROCESS EXITING 15980 1727204144.19336: Calling all_plugins_play to load vars for managed-node2 15980 1727204144.19339: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204144.19343: Calling groups_plugins_play to load vars for managed-node2 15980 1727204144.19546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204144.19834: done with get_vars() 15980 1727204144.19848: done getting variables 15980 1727204144.19919: in VariableManager get_vars() 15980 1727204144.19932: Calling all_inventory to load vars for managed-node2 15980 1727204144.19935: Calling groups_inventory to load vars for managed-node2 15980 1727204144.19937: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204144.19942: Calling all_plugins_play to load vars for managed-node2 15980 1727204144.19944: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204144.19946: Calling groups_plugins_play to load vars for managed-node2 15980 1727204144.20079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204144.20363: done with get_vars() 15980 1727204144.20888: done queuing things up, now waiting for results queue to drain 15980 1727204144.20891: results queue empty 15980 1727204144.20892: checking for any_errors_fatal 15980 1727204144.20897: done checking for any_errors_fatal 15980 1727204144.20898: checking for max_fail_percentage 15980 
1727204144.20899: done checking for max_fail_percentage 15980 1727204144.20900: checking to see if all hosts have failed and the running result is not ok 15980 1727204144.20901: done checking to see if all hosts have failed 15980 1727204144.20906: getting the remaining hosts for this loop 15980 1727204144.20907: done getting the remaining hosts for this loop 15980 1727204144.20911: getting the next task for host managed-node2 15980 1727204144.20915: done getting next task for host managed-node2 15980 1727204144.20918: ^ task is: TASK: Set interface={{ interface }} 15980 1727204144.20920: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204144.20922: getting variables 15980 1727204144.20923: in VariableManager get_vars() 15980 1727204144.20937: Calling all_inventory to load vars for managed-node2 15980 1727204144.20940: Calling groups_inventory to load vars for managed-node2 15980 1727204144.20942: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204144.20949: Calling all_plugins_play to load vars for managed-node2 15980 1727204144.20951: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204144.20955: Calling groups_plugins_play to load vars for managed-node2 15980 1727204144.21521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204144.22154: done with get_vars() 15980 1727204144.22371: done getting variables 15980 1727204144.22430: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) 15980 1727204144.22779: variable 'interface' from source: play vars TASK [Set interface=LSR-TST-br31] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:9 Tuesday 24 September 2024 14:55:44 -0400 (0:00:01.190) 0:00:05.638 ***** 15980 1727204144.22825: entering _queue_task() for managed-node2/set_fact 15980 1727204144.24309: worker is 1 (out of 1 available) 15980 1727204144.24322: exiting _queue_task() for managed-node2/set_fact 15980 1727204144.24336: done queuing things up, now waiting for results queue to drain 15980 1727204144.24338: waiting for pending results... 15980 1727204144.24889: running TaskExecutor() for managed-node2/TASK: Set interface=LSR-TST-br31 15980 1727204144.25120: in run() - task 127b8e07-fff9-5f1d-4b72-00000000000b 15980 1727204144.25228: variable 'ansible_search_path' from source: unknown 15980 1727204144.25255: calling self._execute() 15980 1727204144.25552: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204144.25596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204144.25631: variable 'omit' from source: magic vars 15980 1727204144.26655: variable 'ansible_distribution_major_version' from source: facts 15980 1727204144.26749: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204144.26758: variable 'omit' from source: magic vars 15980 1727204144.26790: variable 'omit' from source: magic vars 15980 1727204144.26903: variable 'interface' from source: play vars 15980 1727204144.27105: variable 'interface' from source: play vars 15980 1727204144.27157: variable 'omit' from source: magic vars 15980 1727204144.27292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204144.27401: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204144.27509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204144.27512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204144.27520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204144.27618: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204144.27623: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204144.27626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204144.27925: Set connection var ansible_connection to ssh 15980 1727204144.27928: Set connection var ansible_pipelining to False 15980 1727204144.27946: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204144.28178: Set connection var ansible_timeout to 10 15980 1727204144.28181: Set connection var ansible_shell_type to sh 15980 1727204144.28183: Set connection var ansible_shell_executable to /bin/sh 15980 1727204144.28185: variable 'ansible_shell_executable' from source: unknown 15980 1727204144.28188: variable 'ansible_connection' from source: unknown 15980 1727204144.28190: variable 'ansible_module_compression' from source: unknown 15980 1727204144.28192: variable 'ansible_shell_type' from source: unknown 15980 1727204144.28194: variable 'ansible_shell_executable' from source: unknown 15980 1727204144.28196: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204144.28197: variable 'ansible_pipelining' from source: unknown 15980 1727204144.28199: variable 'ansible_timeout' from source: unknown 15980 1727204144.28201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 
1727204144.28544: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204144.28632: variable 'omit' from source: magic vars 15980 1727204144.28724: starting attempt loop 15980 1727204144.28727: running the handler 15980 1727204144.28730: handler run complete 15980 1727204144.28732: attempt loop complete, returning result 15980 1727204144.28735: _execute() done 15980 1727204144.28736: dumping result to json 15980 1727204144.28738: done dumping result, returning 15980 1727204144.28740: done running TaskExecutor() for managed-node2/TASK: Set interface=LSR-TST-br31 [127b8e07-fff9-5f1d-4b72-00000000000b] 15980 1727204144.28743: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000000b 15980 1727204144.29052: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000000b 15980 1727204144.29057: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "interface": "LSR-TST-br31" }, "changed": false } 15980 1727204144.29129: no more pending results, returning what we have 15980 1727204144.29132: results queue empty 15980 1727204144.29133: checking for any_errors_fatal 15980 1727204144.29135: done checking for any_errors_fatal 15980 1727204144.29136: checking for max_fail_percentage 15980 1727204144.29138: done checking for max_fail_percentage 15980 1727204144.29138: checking to see if all hosts have failed and the running result is not ok 15980 1727204144.29140: done checking to see if all hosts have failed 15980 1727204144.29141: getting the remaining hosts for this loop 15980 1727204144.29143: done getting the remaining hosts for this loop 15980 1727204144.29147: getting the next task for host managed-node2 15980 1727204144.29153: done getting next task for host managed-node2 15980 
1727204144.29157: ^ task is: TASK: Include the task 'show_interfaces.yml' 15980 1727204144.29159: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204144.29164: getting variables 15980 1727204144.29167: in VariableManager get_vars() 15980 1727204144.29198: Calling all_inventory to load vars for managed-node2 15980 1727204144.29201: Calling groups_inventory to load vars for managed-node2 15980 1727204144.29205: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204144.29218: Calling all_plugins_play to load vars for managed-node2 15980 1727204144.29220: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204144.29223: Calling groups_plugins_play to load vars for managed-node2 15980 1727204144.29740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204144.30774: done with get_vars() 15980 1727204144.30792: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:12 Tuesday 24 September 2024 14:55:44 -0400 (0:00:00.080) 0:00:05.719 ***** 15980 1727204144.30902: entering _queue_task() for managed-node2/include_tasks 15980 1727204144.31461: worker is 1 (out of 1 available) 15980 1727204144.31479: exiting _queue_task() for managed-node2/include_tasks 15980 1727204144.31493: done queuing things up, now waiting for results queue to drain 15980 1727204144.31496: waiting for pending results... 
15980 1727204144.32086: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 15980 1727204144.32283: in run() - task 127b8e07-fff9-5f1d-4b72-00000000000c 15980 1727204144.32303: variable 'ansible_search_path' from source: unknown 15980 1727204144.32368: calling self._execute() 15980 1727204144.32522: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204144.32693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204144.32774: variable 'omit' from source: magic vars 15980 1727204144.35334: variable 'ansible_distribution_major_version' from source: facts 15980 1727204144.35806: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204144.35810: _execute() done 15980 1727204144.35813: dumping result to json 15980 1727204144.35816: done dumping result, returning 15980 1727204144.35818: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [127b8e07-fff9-5f1d-4b72-00000000000c] 15980 1727204144.35820: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000000c 15980 1727204144.36226: no more pending results, returning what we have 15980 1727204144.36233: in VariableManager get_vars() 15980 1727204144.36273: Calling all_inventory to load vars for managed-node2 15980 1727204144.36277: Calling groups_inventory to load vars for managed-node2 15980 1727204144.36281: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204144.36298: Calling all_plugins_play to load vars for managed-node2 15980 1727204144.36300: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204144.36303: Calling groups_plugins_play to load vars for managed-node2 15980 1727204144.36840: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000000c 15980 1727204144.36845: WORKER PROCESS EXITING 15980 1727204144.36950: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204144.37254: done with get_vars() 15980 1727204144.37564: variable 'ansible_search_path' from source: unknown 15980 1727204144.37585: we have included files to process 15980 1727204144.37586: generating all_blocks data 15980 1727204144.37588: done generating all_blocks data 15980 1727204144.37588: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15980 1727204144.37590: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15980 1727204144.37592: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15980 1727204144.37904: in VariableManager get_vars() 15980 1727204144.37950: done with get_vars() 15980 1727204144.38075: done processing included file 15980 1727204144.38078: iterating over new_blocks loaded from include file 15980 1727204144.38080: in VariableManager get_vars() 15980 1727204144.38093: done with get_vars() 15980 1727204144.38095: filtering new block on tags 15980 1727204144.38114: done filtering new block on tags 15980 1727204144.38116: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 15980 1727204144.38122: extending task lists for all hosts with included blocks 15980 1727204144.38504: done extending task lists 15980 1727204144.38506: done processing included files 15980 1727204144.38506: results queue empty 15980 1727204144.38507: checking for any_errors_fatal 15980 1727204144.38512: done checking for any_errors_fatal 15980 1727204144.38513: checking for max_fail_percentage 15980 1727204144.38514: done checking for 
max_fail_percentage 15980 1727204144.38515: checking to see if all hosts have failed and the running result is not ok 15980 1727204144.38516: done checking to see if all hosts have failed 15980 1727204144.38517: getting the remaining hosts for this loop 15980 1727204144.38518: done getting the remaining hosts for this loop 15980 1727204144.38520: getting the next task for host managed-node2 15980 1727204144.38525: done getting next task for host managed-node2 15980 1727204144.38527: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 15980 1727204144.38530: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204144.38533: getting variables 15980 1727204144.38534: in VariableManager get_vars() 15980 1727204144.38545: Calling all_inventory to load vars for managed-node2 15980 1727204144.38547: Calling groups_inventory to load vars for managed-node2 15980 1727204144.38549: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204144.38555: Calling all_plugins_play to load vars for managed-node2 15980 1727204144.38558: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204144.38561: Calling groups_plugins_play to load vars for managed-node2 15980 1727204144.39022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204144.39433: done with get_vars() 15980 1727204144.39446: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:55:44 -0400 (0:00:00.086) 0:00:05.805 ***** 15980 1727204144.39534: entering _queue_task() for managed-node2/include_tasks 15980 1727204144.40336: worker is 1 (out of 1 available) 15980 1727204144.40351: exiting _queue_task() for managed-node2/include_tasks 15980 1727204144.40365: done queuing things up, now waiting for results queue to drain 15980 1727204144.40368: waiting for pending results... 
15980 1727204144.40837: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 15980 1727204144.41084: in run() - task 127b8e07-fff9-5f1d-4b72-0000000000ee 15980 1727204144.41376: variable 'ansible_search_path' from source: unknown 15980 1727204144.41441: variable 'ansible_search_path' from source: unknown 15980 1727204144.41779: calling self._execute() 15980 1727204144.42336: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204144.42341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204144.42351: variable 'omit' from source: magic vars 15980 1727204144.44317: variable 'ansible_distribution_major_version' from source: facts 15980 1727204144.44321: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204144.44324: _execute() done 15980 1727204144.44329: dumping result to json 15980 1727204144.44331: done dumping result, returning 15980 1727204144.44334: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [127b8e07-fff9-5f1d-4b72-0000000000ee] 15980 1727204144.44336: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000ee 15980 1727204144.44793: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000ee 15980 1727204144.44800: WORKER PROCESS EXITING 15980 1727204144.44835: no more pending results, returning what we have 15980 1727204144.44841: in VariableManager get_vars() 15980 1727204144.44884: Calling all_inventory to load vars for managed-node2 15980 1727204144.44888: Calling groups_inventory to load vars for managed-node2 15980 1727204144.44892: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204144.44909: Calling all_plugins_play to load vars for managed-node2 15980 1727204144.44912: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204144.44916: Calling groups_plugins_play to load vars for managed-node2 15980 
1727204144.45332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204144.46003: done with get_vars() 15980 1727204144.46015: variable 'ansible_search_path' from source: unknown 15980 1727204144.46016: variable 'ansible_search_path' from source: unknown 15980 1727204144.46064: we have included files to process 15980 1727204144.46469: generating all_blocks data 15980 1727204144.46473: done generating all_blocks data 15980 1727204144.46474: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15980 1727204144.46476: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15980 1727204144.46480: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15980 1727204144.47417: done processing included file 15980 1727204144.47420: iterating over new_blocks loaded from include file 15980 1727204144.47422: in VariableManager get_vars() 15980 1727204144.47530: done with get_vars() 15980 1727204144.47532: filtering new block on tags 15980 1727204144.47555: done filtering new block on tags 15980 1727204144.47557: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 15980 1727204144.47563: extending task lists for all hosts with included blocks 15980 1727204144.48189: done extending task lists 15980 1727204144.48190: done processing included files 15980 1727204144.48191: results queue empty 15980 1727204144.48192: checking for any_errors_fatal 15980 1727204144.48195: done checking for any_errors_fatal 15980 1727204144.48196: checking for max_fail_percentage 15980 1727204144.48198: done 
checking for max_fail_percentage 15980 1727204144.48198: checking to see if all hosts have failed and the running result is not ok 15980 1727204144.48200: done checking to see if all hosts have failed 15980 1727204144.48200: getting the remaining hosts for this loop 15980 1727204144.48202: done getting the remaining hosts for this loop 15980 1727204144.48204: getting the next task for host managed-node2 15980 1727204144.48209: done getting next task for host managed-node2 15980 1727204144.48211: ^ task is: TASK: Gather current interface info 15980 1727204144.48215: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204144.48217: getting variables 15980 1727204144.48219: in VariableManager get_vars() 15980 1727204144.48232: Calling all_inventory to load vars for managed-node2 15980 1727204144.48235: Calling groups_inventory to load vars for managed-node2 15980 1727204144.48237: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204144.48244: Calling all_plugins_play to load vars for managed-node2 15980 1727204144.48246: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204144.48249: Calling groups_plugins_play to load vars for managed-node2 15980 1727204144.48563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204144.49153: done with get_vars() 15980 1727204144.49182: done getting variables 15980 1727204144.49510: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:55:44 -0400 (0:00:00.100) 0:00:05.905 ***** 15980 1727204144.49545: entering _queue_task() for managed-node2/command 15980 1727204144.50407: worker is 1 (out of 1 available) 15980 1727204144.50551: exiting _queue_task() for managed-node2/command 15980 1727204144.50566: done queuing things up, now waiting for results queue to drain 15980 1727204144.50693: waiting for pending results... 
15980 1727204144.51402: running TaskExecutor() for managed-node2/TASK: Gather current interface info 15980 1727204144.51409: in run() - task 127b8e07-fff9-5f1d-4b72-0000000000fd 15980 1727204144.51412: variable 'ansible_search_path' from source: unknown 15980 1727204144.51415: variable 'ansible_search_path' from source: unknown 15980 1727204144.51476: calling self._execute() 15980 1727204144.51682: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204144.51840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204144.52051: variable 'omit' from source: magic vars 15980 1727204144.53002: variable 'ansible_distribution_major_version' from source: facts 15980 1727204144.53172: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204144.53202: variable 'omit' from source: magic vars 15980 1727204144.53397: variable 'omit' from source: magic vars 15980 1727204144.53484: variable 'omit' from source: magic vars 15980 1727204144.53631: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204144.53681: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204144.53907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204144.53911: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204144.53913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204144.53916: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204144.53918: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204144.53921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 
1727204144.54089: Set connection var ansible_connection to ssh 15980 1727204144.54184: Set connection var ansible_pipelining to False 15980 1727204144.54197: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204144.54208: Set connection var ansible_timeout to 10 15980 1727204144.54219: Set connection var ansible_shell_type to sh 15980 1727204144.54242: Set connection var ansible_shell_executable to /bin/sh 15980 1727204144.54448: variable 'ansible_shell_executable' from source: unknown 15980 1727204144.54451: variable 'ansible_connection' from source: unknown 15980 1727204144.54454: variable 'ansible_module_compression' from source: unknown 15980 1727204144.54456: variable 'ansible_shell_type' from source: unknown 15980 1727204144.54458: variable 'ansible_shell_executable' from source: unknown 15980 1727204144.54460: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204144.54462: variable 'ansible_pipelining' from source: unknown 15980 1727204144.54472: variable 'ansible_timeout' from source: unknown 15980 1727204144.54480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204144.54900: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204144.54992: variable 'omit' from source: magic vars 15980 1727204144.55016: starting attempt loop 15980 1727204144.55024: running the handler 15980 1727204144.55048: _low_level_execute_command(): starting 15980 1727204144.55095: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204144.57741: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204144.57762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204144.58083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204144.58147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204144.58316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204144.58685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204144.60262: stdout chunk (state=3): >>>/root <<< 15980 1727204144.60535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204144.60580: stderr chunk (state=3): >>><<< 15980 1727204144.60584: stdout chunk (state=3): >>><<< 15980 1727204144.60617: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204144.60863: _low_level_execute_command(): starting 15980 1727204144.60870: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204144.6073098-16708-62330187317994 `" && echo ansible-tmp-1727204144.6073098-16708-62330187317994="` echo /root/.ansible/tmp/ansible-tmp-1727204144.6073098-16708-62330187317994 `" ) && sleep 0' 15980 1727204144.62606: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204144.62824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204144.62852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204144.62969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204144.64992: stdout chunk (state=3): >>>ansible-tmp-1727204144.6073098-16708-62330187317994=/root/.ansible/tmp/ansible-tmp-1727204144.6073098-16708-62330187317994 <<< 15980 1727204144.65255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204144.65286: stdout chunk (state=3): >>><<< 15980 1727204144.65289: stderr chunk (state=3): >>><<< 15980 1727204144.65384: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204144.6073098-16708-62330187317994=/root/.ansible/tmp/ansible-tmp-1727204144.6073098-16708-62330187317994 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204144.65591: variable 'ansible_module_compression' from source: unknown 15980 1727204144.65625: ANSIBALLZ: Using generic lock for ansible.legacy.command 15980 1727204144.65675: ANSIBALLZ: Acquiring lock 15980 1727204144.65684: ANSIBALLZ: Lock acquired: 139981197612416 15980 1727204144.65695: ANSIBALLZ: Creating module 15980 1727204145.06607: ANSIBALLZ: Writing module into payload 15980 1727204145.06759: ANSIBALLZ: Writing module 15980 1727204145.06797: ANSIBALLZ: Renaming module 15980 1727204145.06876: ANSIBALLZ: Done creating module 15980 1727204145.06891: variable 'ansible_facts' from source: unknown 15980 1727204145.06978: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204144.6073098-16708-62330187317994/AnsiballZ_command.py 15980 1727204145.07294: Sending initial data 15980 1727204145.07297: Sent initial data (155 bytes) 15980 1727204145.08325: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204145.08341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 
1727204145.08441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204145.08488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204145.08564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204145.10303: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204145.10395: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204145.10481: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp331xnvq_ /root/.ansible/tmp/ansible-tmp-1727204144.6073098-16708-62330187317994/AnsiballZ_command.py <<< 15980 1727204145.10497: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204144.6073098-16708-62330187317994/AnsiballZ_command.py" <<< 15980 1727204145.10582: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp331xnvq_" to remote "/root/.ansible/tmp/ansible-tmp-1727204144.6073098-16708-62330187317994/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204144.6073098-16708-62330187317994/AnsiballZ_command.py" <<< 15980 1727204145.11561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204145.11605: stderr chunk (state=3): >>><<< 15980 1727204145.11653: stdout chunk (state=3): >>><<< 15980 1727204145.11691: done transferring module to remote 15980 1727204145.11771: _low_level_execute_command(): starting 15980 1727204145.11775: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204144.6073098-16708-62330187317994/ /root/.ansible/tmp/ansible-tmp-1727204144.6073098-16708-62330187317994/AnsiballZ_command.py && sleep 0' 15980 1727204145.12496: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204145.12529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204145.12583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204145.12668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204145.12687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204145.12742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204145.12987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204145.14943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204145.15360: stderr chunk (state=3): >>><<< 15980 1727204145.15364: stdout chunk (state=3): >>><<< 15980 1727204145.15370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204145.15373: _low_level_execute_command(): starting 15980 1727204145.15376: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204144.6073098-16708-62330187317994/AnsiballZ_command.py && sleep 0' 15980 1727204145.16498: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204145.16788: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204145.17041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204145.17339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 15980 1727204145.34295: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:55:45.337882", "end": "2024-09-24 14:55:45.341503", "delta": "0:00:00.003621", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15980 1727204145.36103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204145.36207: stderr chunk (state=3): >>><<< 15980 1727204145.36211: stdout chunk (state=3): >>><<< 15980 1727204145.36241: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:55:45.337882", "end": "2024-09-24 14:55:45.341503", "delta": "0:00:00.003621", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 15980 1727204145.36316: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204144.6073098-16708-62330187317994/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204145.36571: _low_level_execute_command(): starting 15980 1727204145.36574: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204144.6073098-16708-62330187317994/ > /dev/null 2>&1 && sleep 0' 15980 1727204145.37747: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204145.37752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204145.37878: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204145.37885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204145.38102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204145.38180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204145.40193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204145.40198: stderr chunk (state=3): >>><<< 15980 1727204145.40202: stdout chunk (state=3): >>><<< 15980 1727204145.40240: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204145.40246: handler run complete 15980 1727204145.40336: Evaluated conditional (False): False 15980 1727204145.40350: attempt loop complete, returning result 15980 1727204145.40353: _execute() done 15980 1727204145.40356: dumping result to json 15980 1727204145.40362: done dumping result, returning 15980 1727204145.40373: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [127b8e07-fff9-5f1d-4b72-0000000000fd] 15980 1727204145.40378: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000fd 15980 1727204145.40498: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000fd 15980 1727204145.40501: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003621", "end": "2024-09-24 14:55:45.341503", "rc": 0, "start": "2024-09-24 14:55:45.337882" } STDOUT: bonding_masters eth0 lo 15980 1727204145.40655: no more pending results, returning what we have 15980 1727204145.40659: results queue empty 15980 1727204145.40660: checking for any_errors_fatal 15980 1727204145.40661: done checking for any_errors_fatal 15980 1727204145.40662: checking for max_fail_percentage 15980 1727204145.40663: done checking for max_fail_percentage 15980 1727204145.40664: checking to see if all hosts have failed and the running result is not ok 15980 1727204145.40667: done checking to see if all hosts have failed 15980 1727204145.40668: getting 
the remaining hosts for this loop 15980 1727204145.40670: done getting the remaining hosts for this loop 15980 1727204145.40674: getting the next task for host managed-node2 15980 1727204145.40682: done getting next task for host managed-node2 15980 1727204145.40685: ^ task is: TASK: Set current_interfaces 15980 1727204145.40688: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204145.40692: getting variables 15980 1727204145.40694: in VariableManager get_vars() 15980 1727204145.40730: Calling all_inventory to load vars for managed-node2 15980 1727204145.40734: Calling groups_inventory to load vars for managed-node2 15980 1727204145.40738: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204145.40752: Calling all_plugins_play to load vars for managed-node2 15980 1727204145.40756: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204145.40760: Calling groups_plugins_play to load vars for managed-node2 15980 1727204145.41892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204145.42635: done with get_vars() 15980 1727204145.42723: done getting variables 15980 1727204145.42918: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:55:45 -0400 (0:00:00.935) 0:00:06.841 ***** 15980 1727204145.43076: entering _queue_task() for managed-node2/set_fact 15980 1727204145.44367: worker is 1 (out of 1 available) 15980 1727204145.44376: exiting _queue_task() for managed-node2/set_fact 15980 1727204145.44386: done queuing things up, now waiting for results queue to drain 15980 1727204145.44388: waiting for pending results... 
15980 1727204145.44477: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 15980 1727204145.44484: in run() - task 127b8e07-fff9-5f1d-4b72-0000000000fe 15980 1727204145.44487: variable 'ansible_search_path' from source: unknown 15980 1727204145.44490: variable 'ansible_search_path' from source: unknown 15980 1727204145.44552: calling self._execute() 15980 1727204145.44687: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204145.44707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204145.44942: variable 'omit' from source: magic vars 15980 1727204145.45217: variable 'ansible_distribution_major_version' from source: facts 15980 1727204145.45240: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204145.45253: variable 'omit' from source: magic vars 15980 1727204145.45313: variable 'omit' from source: magic vars 15980 1727204145.45540: variable '_current_interfaces' from source: set_fact 15980 1727204145.45617: variable 'omit' from source: magic vars 15980 1727204145.45662: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204145.45711: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204145.45739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204145.45767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204145.45786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204145.45822: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204145.45832: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204145.45839: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204145.45956: Set connection var ansible_connection to ssh 15980 1727204145.45973: Set connection var ansible_pipelining to False 15980 1727204145.45986: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204145.45997: Set connection var ansible_timeout to 10 15980 1727204145.46007: Set connection var ansible_shell_type to sh 15980 1727204145.46019: Set connection var ansible_shell_executable to /bin/sh 15980 1727204145.46058: variable 'ansible_shell_executable' from source: unknown 15980 1727204145.46068: variable 'ansible_connection' from source: unknown 15980 1727204145.46080: variable 'ansible_module_compression' from source: unknown 15980 1727204145.46087: variable 'ansible_shell_type' from source: unknown 15980 1727204145.46093: variable 'ansible_shell_executable' from source: unknown 15980 1727204145.46100: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204145.46108: variable 'ansible_pipelining' from source: unknown 15980 1727204145.46114: variable 'ansible_timeout' from source: unknown 15980 1727204145.46121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204145.46524: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204145.46528: variable 'omit' from source: magic vars 15980 1727204145.46530: starting attempt loop 15980 1727204145.46532: running the handler 15980 1727204145.46633: handler run complete 15980 1727204145.46955: attempt loop complete, returning result 15980 1727204145.46958: _execute() done 15980 1727204145.46960: dumping result to json 15980 1727204145.46962: done dumping result, returning 15980 
1727204145.46964: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [127b8e07-fff9-5f1d-4b72-0000000000fe] 15980 1727204145.46969: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000fe 15980 1727204145.47042: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000fe 15980 1727204145.47045: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 15980 1727204145.47231: no more pending results, returning what we have 15980 1727204145.47234: results queue empty 15980 1727204145.47235: checking for any_errors_fatal 15980 1727204145.47243: done checking for any_errors_fatal 15980 1727204145.47244: checking for max_fail_percentage 15980 1727204145.47245: done checking for max_fail_percentage 15980 1727204145.47246: checking to see if all hosts have failed and the running result is not ok 15980 1727204145.47247: done checking to see if all hosts have failed 15980 1727204145.47248: getting the remaining hosts for this loop 15980 1727204145.47250: done getting the remaining hosts for this loop 15980 1727204145.47255: getting the next task for host managed-node2 15980 1727204145.47263: done getting next task for host managed-node2 15980 1727204145.47268: ^ task is: TASK: Show current_interfaces 15980 1727204145.47270: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204145.47275: getting variables 15980 1727204145.47276: in VariableManager get_vars() 15980 1727204145.47308: Calling all_inventory to load vars for managed-node2 15980 1727204145.47311: Calling groups_inventory to load vars for managed-node2 15980 1727204145.47315: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204145.47328: Calling all_plugins_play to load vars for managed-node2 15980 1727204145.47332: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204145.47335: Calling groups_plugins_play to load vars for managed-node2 15980 1727204145.47972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204145.48417: done with get_vars() 15980 1727204145.48431: done getting variables 15980 1727204145.48721: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:55:45 -0400 (0:00:00.059) 0:00:06.900 ***** 15980 1727204145.49030: entering _queue_task() for managed-node2/debug 15980 1727204145.49033: Creating lock for debug 15980 1727204145.49991: worker is 1 (out of 1 available) 15980 1727204145.50004: exiting _queue_task() for managed-node2/debug 15980 1727204145.50015: done queuing things up, now waiting for results queue to drain 15980 1727204145.50017: waiting for pending results... 
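The "Show current_interfaces" task queued above loads the `debug` action plugin and, per the result recorded further down, prints `current_interfaces: ['bonding_masters', 'eth0', 'lo']`. A hedged sketch of the task at show_interfaces.yml:5 consistent with that output (the exact `msg` wording is an assumption):

```yaml
# Plausible form of the task at show_interfaces.yml:5; the message
# template is inferred from the MSG line in the result below.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```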
15980 1727204145.50502: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 15980 1727204145.50700: in run() - task 127b8e07-fff9-5f1d-4b72-0000000000ef 15980 1727204145.50706: variable 'ansible_search_path' from source: unknown 15980 1727204145.50709: variable 'ansible_search_path' from source: unknown 15980 1727204145.50774: calling self._execute() 15980 1727204145.50950: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204145.50963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204145.50980: variable 'omit' from source: magic vars 15980 1727204145.51499: variable 'ansible_distribution_major_version' from source: facts 15980 1727204145.51519: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204145.51531: variable 'omit' from source: magic vars 15980 1727204145.51589: variable 'omit' from source: magic vars 15980 1727204145.51719: variable 'current_interfaces' from source: set_fact 15980 1727204145.51756: variable 'omit' from source: magic vars 15980 1727204145.51894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204145.51898: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204145.51901: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204145.51911: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204145.51929: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204145.51969: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204145.51980: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204145.51989: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204145.52116: Set connection var ansible_connection to ssh 15980 1727204145.52131: Set connection var ansible_pipelining to False 15980 1727204145.52143: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204145.52155: Set connection var ansible_timeout to 10 15980 1727204145.52168: Set connection var ansible_shell_type to sh 15980 1727204145.52180: Set connection var ansible_shell_executable to /bin/sh 15980 1727204145.52223: variable 'ansible_shell_executable' from source: unknown 15980 1727204145.52233: variable 'ansible_connection' from source: unknown 15980 1727204145.52241: variable 'ansible_module_compression' from source: unknown 15980 1727204145.52248: variable 'ansible_shell_type' from source: unknown 15980 1727204145.52329: variable 'ansible_shell_executable' from source: unknown 15980 1727204145.52333: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204145.52336: variable 'ansible_pipelining' from source: unknown 15980 1727204145.52338: variable 'ansible_timeout' from source: unknown 15980 1727204145.52341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204145.52456: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204145.52518: variable 'omit' from source: magic vars 15980 1727204145.52528: starting attempt loop 15980 1727204145.52535: running the handler 15980 1727204145.52598: handler run complete 15980 1727204145.52628: attempt loop complete, returning result 15980 1727204145.52664: _execute() done 15980 1727204145.52682: dumping result to json 15980 1727204145.52967: done dumping result, returning 15980 1727204145.52971: done 
running TaskExecutor() for managed-node2/TASK: Show current_interfaces [127b8e07-fff9-5f1d-4b72-0000000000ef] 15980 1727204145.52974: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000ef 15980 1727204145.53052: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000000ef 15980 1727204145.53055: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 15980 1727204145.53114: no more pending results, returning what we have 15980 1727204145.53118: results queue empty 15980 1727204145.53119: checking for any_errors_fatal 15980 1727204145.53126: done checking for any_errors_fatal 15980 1727204145.53126: checking for max_fail_percentage 15980 1727204145.53128: done checking for max_fail_percentage 15980 1727204145.53129: checking to see if all hosts have failed and the running result is not ok 15980 1727204145.53130: done checking to see if all hosts have failed 15980 1727204145.53131: getting the remaining hosts for this loop 15980 1727204145.53133: done getting the remaining hosts for this loop 15980 1727204145.53138: getting the next task for host managed-node2 15980 1727204145.53147: done getting next task for host managed-node2 15980 1727204145.53151: ^ task is: TASK: Include the task 'assert_device_absent.yml' 15980 1727204145.53153: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204145.53158: getting variables 15980 1727204145.53160: in VariableManager get_vars() 15980 1727204145.53195: Calling all_inventory to load vars for managed-node2 15980 1727204145.53199: Calling groups_inventory to load vars for managed-node2 15980 1727204145.53203: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204145.53219: Calling all_plugins_play to load vars for managed-node2 15980 1727204145.53223: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204145.53226: Calling groups_plugins_play to load vars for managed-node2 15980 1727204145.54101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204145.54309: done with get_vars() 15980 1727204145.54321: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:14 Tuesday 24 September 2024 14:55:45 -0400 (0:00:00.053) 0:00:06.954 ***** 15980 1727204145.54428: entering _queue_task() for managed-node2/include_tasks 15980 1727204145.55282: worker is 1 (out of 1 available) 15980 1727204145.55294: exiting _queue_task() for managed-node2/include_tasks 15980 1727204145.55305: done queuing things up, now waiting for results queue to drain 15980 1727204145.55307: waiting for pending results... 
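The include task queued above (tests_bridge.yml:14) resolves to the file assert_device_absent.yml, as the "processing included file" lines that follow confirm. Its form is almost certainly a bare `include_tasks`; the relative path below is an assumption based on the resolved absolute path in the log:

```yaml
# Assumed form of the include at tests_bridge.yml:14; the log confirms
# the target file, not the exact path expression used in the playbook.
- name: Include the task 'assert_device_absent.yml'
  include_tasks: tasks/assert_device_absent.yml
```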
15980 1727204145.55833: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_absent.yml' 15980 1727204145.55840: in run() - task 127b8e07-fff9-5f1d-4b72-00000000000d 15980 1727204145.55854: variable 'ansible_search_path' from source: unknown 15980 1727204145.55972: calling self._execute() 15980 1727204145.56177: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204145.56180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204145.56183: variable 'omit' from source: magic vars 15980 1727204145.56594: variable 'ansible_distribution_major_version' from source: facts 15980 1727204145.56615: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204145.56627: _execute() done 15980 1727204145.56634: dumping result to json 15980 1727204145.56641: done dumping result, returning 15980 1727204145.56651: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_absent.yml' [127b8e07-fff9-5f1d-4b72-00000000000d] 15980 1727204145.56660: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000000d 15980 1727204145.56819: no more pending results, returning what we have 15980 1727204145.56825: in VariableManager get_vars() 15980 1727204145.56867: Calling all_inventory to load vars for managed-node2 15980 1727204145.56871: Calling groups_inventory to load vars for managed-node2 15980 1727204145.56876: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204145.56896: Calling all_plugins_play to load vars for managed-node2 15980 1727204145.56901: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204145.56905: Calling groups_plugins_play to load vars for managed-node2 15980 1727204145.57247: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000000d 15980 1727204145.57250: WORKER PROCESS EXITING 15980 1727204145.57285: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204145.58042: done with get_vars() 15980 1727204145.58079: variable 'ansible_search_path' from source: unknown 15980 1727204145.58096: we have included files to process 15980 1727204145.58097: generating all_blocks data 15980 1727204145.58103: done generating all_blocks data 15980 1727204145.58109: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15980 1727204145.58111: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15980 1727204145.58113: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15980 1727204145.58584: in VariableManager get_vars() 15980 1727204145.58602: done with get_vars() 15980 1727204145.58825: done processing included file 15980 1727204145.58829: iterating over new_blocks loaded from include file 15980 1727204145.58831: in VariableManager get_vars() 15980 1727204145.58851: done with get_vars() 15980 1727204145.58853: filtering new block on tags 15980 1727204145.58952: done filtering new block on tags 15980 1727204145.58959: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 15980 1727204145.58969: extending task lists for all hosts with included blocks 15980 1727204145.59474: done extending task lists 15980 1727204145.59476: done processing included files 15980 1727204145.59476: results queue empty 15980 1727204145.59477: checking for any_errors_fatal 15980 1727204145.59481: done checking for any_errors_fatal 15980 1727204145.59482: checking for max_fail_percentage 15980 1727204145.59483: done 
checking for max_fail_percentage 15980 1727204145.59484: checking to see if all hosts have failed and the running result is not ok 15980 1727204145.59487: done checking to see if all hosts have failed 15980 1727204145.59489: getting the remaining hosts for this loop 15980 1727204145.59491: done getting the remaining hosts for this loop 15980 1727204145.59494: getting the next task for host managed-node2 15980 1727204145.59498: done getting next task for host managed-node2 15980 1727204145.59501: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15980 1727204145.59508: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204145.59515: getting variables 15980 1727204145.59516: in VariableManager get_vars() 15980 1727204145.59639: Calling all_inventory to load vars for managed-node2 15980 1727204145.59642: Calling groups_inventory to load vars for managed-node2 15980 1727204145.59645: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204145.59676: Calling all_plugins_play to load vars for managed-node2 15980 1727204145.59683: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204145.59687: Calling groups_plugins_play to load vars for managed-node2 15980 1727204145.59902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204145.60044: done with get_vars() 15980 1727204145.60053: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 14:55:45 -0400 (0:00:00.056) 0:00:07.011 ***** 15980 1727204145.60115: entering _queue_task() for managed-node2/include_tasks 15980 1727204145.60449: worker is 1 (out of 1 available) 15980 1727204145.60469: exiting _queue_task() for managed-node2/include_tasks 15980 1727204145.60482: done queuing things up, now waiting for results queue to drain 15980 1727204145.60484: waiting for pending results... 
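This second include sits at assert_device_absent.yml:3 and pulls in get_interface_stat.yml, again confirmed by the "loading included file" lines that follow. A sketch under the same assumption about path form:

```yaml
# Assumed form of the nested include at assert_device_absent.yml:3.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml
```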
15980 1727204145.60725: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 15980 1727204145.60823: in run() - task 127b8e07-fff9-5f1d-4b72-000000000119 15980 1727204145.60836: variable 'ansible_search_path' from source: unknown 15980 1727204145.60840: variable 'ansible_search_path' from source: unknown 15980 1727204145.60884: calling self._execute() 15980 1727204145.60967: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204145.60974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204145.60985: variable 'omit' from source: magic vars 15980 1727204145.61356: variable 'ansible_distribution_major_version' from source: facts 15980 1727204145.61369: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204145.61376: _execute() done 15980 1727204145.61379: dumping result to json 15980 1727204145.61382: done dumping result, returning 15980 1727204145.61388: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-5f1d-4b72-000000000119] 15980 1727204145.61393: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000119 15980 1727204145.61500: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000119 15980 1727204145.61503: WORKER PROCESS EXITING 15980 1727204145.61543: no more pending results, returning what we have 15980 1727204145.61549: in VariableManager get_vars() 15980 1727204145.61587: Calling all_inventory to load vars for managed-node2 15980 1727204145.61590: Calling groups_inventory to load vars for managed-node2 15980 1727204145.61594: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204145.61607: Calling all_plugins_play to load vars for managed-node2 15980 1727204145.61611: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204145.61616: Calling groups_plugins_play to load vars for managed-node2 15980 
1727204145.61810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204145.61956: done with get_vars() 15980 1727204145.61963: variable 'ansible_search_path' from source: unknown 15980 1727204145.61964: variable 'ansible_search_path' from source: unknown 15980 1727204145.61994: we have included files to process 15980 1727204145.61994: generating all_blocks data 15980 1727204145.61996: done generating all_blocks data 15980 1727204145.61997: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15980 1727204145.61998: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15980 1727204145.61999: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15980 1727204145.62214: done processing included file 15980 1727204145.62215: iterating over new_blocks loaded from include file 15980 1727204145.62217: in VariableManager get_vars() 15980 1727204145.62228: done with get_vars() 15980 1727204145.62229: filtering new block on tags 15980 1727204145.62250: done filtering new block on tags 15980 1727204145.62253: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 15980 1727204145.62258: extending task lists for all hosts with included blocks 15980 1727204145.62373: done extending task lists 15980 1727204145.62374: done processing included files 15980 1727204145.62375: results queue empty 15980 1727204145.62376: checking for any_errors_fatal 15980 1727204145.62379: done checking for any_errors_fatal 15980 1727204145.62380: checking for max_fail_percentage 15980 1727204145.62381: done checking for 
max_fail_percentage 15980 1727204145.62382: checking to see if all hosts have failed and the running result is not ok 15980 1727204145.62383: done checking to see if all hosts have failed 15980 1727204145.62384: getting the remaining hosts for this loop 15980 1727204145.62385: done getting the remaining hosts for this loop 15980 1727204145.62387: getting the next task for host managed-node2 15980 1727204145.62392: done getting next task for host managed-node2 15980 1727204145.62394: ^ task is: TASK: Get stat for interface {{ interface }} 15980 1727204145.62397: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204145.62399: getting variables 15980 1727204145.62400: in VariableManager get_vars() 15980 1727204145.62410: Calling all_inventory to load vars for managed-node2 15980 1727204145.62412: Calling groups_inventory to load vars for managed-node2 15980 1727204145.62414: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204145.62420: Calling all_plugins_play to load vars for managed-node2 15980 1727204145.62422: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204145.62425: Calling groups_plugins_play to load vars for managed-node2 15980 1727204145.62607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204145.62804: done with get_vars() 15980 1727204145.62813: done getting variables 15980 1727204145.62976: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:55:45 -0400 (0:00:00.028) 0:00:07.040 ***** 15980 1727204145.63006: entering _queue_task() for managed-node2/stat 15980 1727204145.63417: worker is 1 (out of 1 available) 15980 1727204145.63431: exiting _queue_task() for managed-node2/stat 15980 1727204145.63443: done queuing things up, now waiting for results queue to drain 15980 1727204145.63445: waiting for pending results... 
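Here the templated task name "Get stat for interface {{ interface }}" renders as "Get stat for interface LSR-TST-br31", and the executor loads the 'normal' action plugin, i.e. a real module invocation (`stat`, per the queue entry above). The log never shows the module arguments, so the path and register name below are assumptions — `/sys/class/net/{{ interface }}` is the conventional way these tests check whether a kernel network device exists:

```yaml
# Hypothetical body of get_interface_stat.yml:3; path and register
# name are assumed, only the module (stat) and the templated task
# name are confirmed by the log.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
  register: interface_stat
```

The SSH traffic that follows (`echo ~`, then the `mkdir -p .ansible/tmp/...` command) is the standard remote-setup sequence Ansible performs before copying such a module to the target.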
15980 1727204145.63943: running TaskExecutor() for managed-node2/TASK: Get stat for interface LSR-TST-br31 15980 1727204145.63949: in run() - task 127b8e07-fff9-5f1d-4b72-000000000133 15980 1727204145.63952: variable 'ansible_search_path' from source: unknown 15980 1727204145.63955: variable 'ansible_search_path' from source: unknown 15980 1727204145.64036: calling self._execute() 15980 1727204145.64129: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204145.64141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204145.64156: variable 'omit' from source: magic vars 15980 1727204145.64577: variable 'ansible_distribution_major_version' from source: facts 15980 1727204145.64642: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204145.64646: variable 'omit' from source: magic vars 15980 1727204145.64648: variable 'omit' from source: magic vars 15980 1727204145.64758: variable 'interface' from source: set_fact 15980 1727204145.64787: variable 'omit' from source: magic vars 15980 1727204145.64834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204145.64881: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204145.64907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204145.64930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204145.64949: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204145.64992: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204145.65001: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204145.65071: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204145.65112: Set connection var ansible_connection to ssh 15980 1727204145.65123: Set connection var ansible_pipelining to False 15980 1727204145.65133: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204145.65142: Set connection var ansible_timeout to 10 15980 1727204145.65150: Set connection var ansible_shell_type to sh 15980 1727204145.65158: Set connection var ansible_shell_executable to /bin/sh 15980 1727204145.65195: variable 'ansible_shell_executable' from source: unknown 15980 1727204145.65204: variable 'ansible_connection' from source: unknown 15980 1727204145.65213: variable 'ansible_module_compression' from source: unknown 15980 1727204145.65220: variable 'ansible_shell_type' from source: unknown 15980 1727204145.65227: variable 'ansible_shell_executable' from source: unknown 15980 1727204145.65234: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204145.65242: variable 'ansible_pipelining' from source: unknown 15980 1727204145.65248: variable 'ansible_timeout' from source: unknown 15980 1727204145.65256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204145.65486: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204145.65570: variable 'omit' from source: magic vars 15980 1727204145.65574: starting attempt loop 15980 1727204145.65579: running the handler 15980 1727204145.65582: _low_level_execute_command(): starting 15980 1727204145.65584: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204145.66403: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204145.66424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204145.66492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204145.66495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204145.66502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204145.66583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204145.68409: stdout chunk (state=3): >>>/root <<< 15980 1727204145.68625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204145.68636: stdout chunk (state=3): >>><<< 15980 1727204145.68639: stderr chunk (state=3): >>><<< 15980 1727204145.68689: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204145.68763: _low_level_execute_command(): starting 15980 1727204145.68769: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204145.686965-16826-223654359919141 `" && echo ansible-tmp-1727204145.686965-16826-223654359919141="` echo /root/.ansible/tmp/ansible-tmp-1727204145.686965-16826-223654359919141 `" ) && sleep 0' 15980 1727204145.70272: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204145.70342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204145.70387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204145.70450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204145.72455: stdout chunk (state=3): >>>ansible-tmp-1727204145.686965-16826-223654359919141=/root/.ansible/tmp/ansible-tmp-1727204145.686965-16826-223654359919141 <<< 15980 1727204145.72582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204145.72662: stderr chunk (state=3): >>><<< 15980 1727204145.72668: stdout chunk (state=3): >>><<< 15980 1727204145.72685: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204145.686965-16826-223654359919141=/root/.ansible/tmp/ansible-tmp-1727204145.686965-16826-223654359919141 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204145.72738: variable 'ansible_module_compression' from source: unknown 15980 1727204145.72786: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15980 1727204145.72817: variable 'ansible_facts' from source: unknown 15980 1727204145.72887: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204145.686965-16826-223654359919141/AnsiballZ_stat.py 15980 1727204145.73016: Sending initial data 15980 1727204145.73020: Sent initial data (152 bytes) 15980 1727204145.73788: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204145.73816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204145.73858: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204145.73928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204145.74138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204145.75710: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204145.75792: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204145.75921: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpq2ihrpsu /root/.ansible/tmp/ansible-tmp-1727204145.686965-16826-223654359919141/AnsiballZ_stat.py <<< 15980 1727204145.75925: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204145.686965-16826-223654359919141/AnsiballZ_stat.py" <<< 15980 1727204145.75977: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpq2ihrpsu" to remote "/root/.ansible/tmp/ansible-tmp-1727204145.686965-16826-223654359919141/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204145.686965-16826-223654359919141/AnsiballZ_stat.py" <<< 15980 1727204145.76990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204145.77039: stderr chunk (state=3): >>><<< 15980 1727204145.77048: stdout chunk (state=3): >>><<< 15980 1727204145.77082: done transferring module to remote 15980 1727204145.77099: _low_level_execute_command(): starting 15980 1727204145.77120: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204145.686965-16826-223654359919141/ /root/.ansible/tmp/ansible-tmp-1727204145.686965-16826-223654359919141/AnsiballZ_stat.py && sleep 0' 15980 1727204145.77916: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204145.77939: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204145.77968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204145.78002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204145.78132: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204145.78149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204145.78189: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204145.78215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204145.78347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204145.80438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204145.80442: stdout chunk (state=3): >>><<< 15980 1727204145.80445: stderr chunk (state=3): >>><<< 15980 1727204145.80447: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204145.80458: _low_level_execute_command(): starting 15980 1727204145.80460: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204145.686965-16826-223654359919141/AnsiballZ_stat.py && sleep 0' 15980 1727204145.81332: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204145.81370: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204145.81387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204145.81406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204145.81440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 1727204145.81562: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204145.81668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204145.81747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204145.98329: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15980 1727204145.99656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204145.99660: stderr chunk (state=3): >>><<< 15980 1727204145.99662: stdout chunk (state=3): >>><<< 15980 1727204145.99668: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 15980 1727204145.99815: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204145.686965-16826-223654359919141/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204145.99819: _low_level_execute_command(): starting 15980 1727204145.99822: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204145.686965-16826-223654359919141/ > /dev/null 2>&1 && sleep 0' 15980 1727204146.01216: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204146.01288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204146.01510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204146.01557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204146.01635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204146.03626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204146.03954: stderr chunk (state=3): >>><<< 15980 1727204146.03958: stdout chunk (state=3): >>><<< 15980 1727204146.03961: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204146.03963: handler run complete 15980 1727204146.03968: attempt loop complete, returning result 15980 1727204146.03970: _execute() done 15980 1727204146.03972: dumping result to json 15980 1727204146.03975: done dumping result, returning 15980 1727204146.03977: done running TaskExecutor() for managed-node2/TASK: Get stat for interface LSR-TST-br31 [127b8e07-fff9-5f1d-4b72-000000000133] 15980 1727204146.03983: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000133 15980 1727204146.04252: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000133 15980 1727204146.04256: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 15980 1727204146.04347: no more pending results, returning what we have 15980 1727204146.04351: results queue empty 15980 1727204146.04353: checking for any_errors_fatal 15980 1727204146.04354: done checking for any_errors_fatal 15980 1727204146.04355: checking for max_fail_percentage 15980 1727204146.04357: done checking for max_fail_percentage 15980 1727204146.04358: checking to see if all hosts have failed and the running result is not ok 15980 1727204146.04359: done checking to see if all hosts have failed 15980 1727204146.04360: getting the remaining hosts for this loop 15980 1727204146.04362: done getting the remaining hosts for this loop 15980 1727204146.04571: getting the next task for host managed-node2 15980 1727204146.04583: done getting next task for host managed-node2 15980 1727204146.04585: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 15980 1727204146.04590: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204146.04596: getting variables 15980 1727204146.04597: in VariableManager get_vars() 15980 1727204146.04630: Calling all_inventory to load vars for managed-node2 15980 1727204146.04634: Calling groups_inventory to load vars for managed-node2 15980 1727204146.04638: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204146.04652: Calling all_plugins_play to load vars for managed-node2 15980 1727204146.04655: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204146.04659: Calling groups_plugins_play to load vars for managed-node2 15980 1727204146.05564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204146.06532: done with get_vars() 15980 1727204146.06546: done getting variables 15980 1727204146.06957: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 15980 1727204146.07561: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 14:55:46 -0400 (0:00:00.450) 0:00:07.490 
***** 15980 1727204146.08021: entering _queue_task() for managed-node2/assert 15980 1727204146.08023: Creating lock for assert 15980 1727204146.09352: worker is 1 (out of 1 available) 15980 1727204146.09372: exiting _queue_task() for managed-node2/assert 15980 1727204146.09387: done queuing things up, now waiting for results queue to drain 15980 1727204146.09390: waiting for pending results... 15980 1727204146.10587: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' 15980 1727204146.10593: in run() - task 127b8e07-fff9-5f1d-4b72-00000000011a 15980 1727204146.10597: variable 'ansible_search_path' from source: unknown 15980 1727204146.10975: variable 'ansible_search_path' from source: unknown 15980 1727204146.10979: calling self._execute() 15980 1727204146.12875: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204146.12880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204146.12883: variable 'omit' from source: magic vars 15980 1727204146.14075: variable 'ansible_distribution_major_version' from source: facts 15980 1727204146.14080: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204146.14082: variable 'omit' from source: magic vars 15980 1727204146.14373: variable 'omit' from source: magic vars 15980 1727204146.14625: variable 'interface' from source: set_fact 15980 1727204146.14656: variable 'omit' from source: magic vars 15980 1727204146.14705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204146.14911: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204146.15173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204146.15177: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 15980 1727204146.15180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204146.15183: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204146.15185: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204146.15277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204146.15575: Set connection var ansible_connection to ssh 15980 1727204146.15742: Set connection var ansible_pipelining to False 15980 1727204146.15763: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204146.15793: Set connection var ansible_timeout to 10 15980 1727204146.15810: Set connection var ansible_shell_type to sh 15980 1727204146.15887: Set connection var ansible_shell_executable to /bin/sh 15980 1727204146.15928: variable 'ansible_shell_executable' from source: unknown 15980 1727204146.16085: variable 'ansible_connection' from source: unknown 15980 1727204146.16099: variable 'ansible_module_compression' from source: unknown 15980 1727204146.16107: variable 'ansible_shell_type' from source: unknown 15980 1727204146.16116: variable 'ansible_shell_executable' from source: unknown 15980 1727204146.16123: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204146.16134: variable 'ansible_pipelining' from source: unknown 15980 1727204146.16228: variable 'ansible_timeout' from source: unknown 15980 1727204146.16232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204146.16572: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 
1727204146.16577: variable 'omit' from source: magic vars 15980 1727204146.16584: starting attempt loop 15980 1727204146.16591: running the handler 15980 1727204146.17272: variable 'interface_stat' from source: set_fact 15980 1727204146.17276: Evaluated conditional (not interface_stat.stat.exists): True 15980 1727204146.17279: handler run complete 15980 1727204146.17281: attempt loop complete, returning result 15980 1727204146.17283: _execute() done 15980 1727204146.17285: dumping result to json 15980 1727204146.17287: done dumping result, returning 15980 1727204146.17290: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' [127b8e07-fff9-5f1d-4b72-00000000011a] 15980 1727204146.17292: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000011a 15980 1727204146.18276: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000011a 15980 1727204146.18284: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 15980 1727204146.18412: no more pending results, returning what we have 15980 1727204146.18415: results queue empty 15980 1727204146.18416: checking for any_errors_fatal 15980 1727204146.18420: done checking for any_errors_fatal 15980 1727204146.18420: checking for max_fail_percentage 15980 1727204146.18422: done checking for max_fail_percentage 15980 1727204146.18422: checking to see if all hosts have failed and the running result is not ok 15980 1727204146.18423: done checking to see if all hosts have failed 15980 1727204146.18424: getting the remaining hosts for this loop 15980 1727204146.18425: done getting the remaining hosts for this loop 15980 1727204146.18431: getting the next task for host managed-node2 15980 1727204146.18437: done getting next task for host managed-node2 15980 1727204146.18439: ^ task is: TASK: meta (flush_handlers) 15980 1727204146.18441: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204146.18444: getting variables 15980 1727204146.18445: in VariableManager get_vars() 15980 1727204146.18471: Calling all_inventory to load vars for managed-node2 15980 1727204146.18474: Calling groups_inventory to load vars for managed-node2 15980 1727204146.18477: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204146.18487: Calling all_plugins_play to load vars for managed-node2 15980 1727204146.18490: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204146.18494: Calling groups_plugins_play to load vars for managed-node2 15980 1727204146.19322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204146.19848: done with get_vars() 15980 1727204146.19861: done getting variables 15980 1727204146.20144: in VariableManager get_vars() 15980 1727204146.20157: Calling all_inventory to load vars for managed-node2 15980 1727204146.20159: Calling groups_inventory to load vars for managed-node2 15980 1727204146.20162: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204146.20170: Calling all_plugins_play to load vars for managed-node2 15980 1727204146.20173: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204146.20176: Calling groups_plugins_play to load vars for managed-node2 15980 1727204146.20597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204146.21022: done with get_vars() 15980 1727204146.21042: done queuing things up, now waiting for results queue to drain 15980 1727204146.21044: results queue empty 15980 1727204146.21045: checking for any_errors_fatal 15980 1727204146.21048: done checking for any_errors_fatal 15980 
1727204146.21049: checking for max_fail_percentage 15980 1727204146.21050: done checking for max_fail_percentage 15980 1727204146.21051: checking to see if all hosts have failed and the running result is not ok 15980 1727204146.21052: done checking to see if all hosts have failed 15980 1727204146.21058: getting the remaining hosts for this loop 15980 1727204146.21059: done getting the remaining hosts for this loop 15980 1727204146.21062: getting the next task for host managed-node2 15980 1727204146.21068: done getting next task for host managed-node2 15980 1727204146.21070: ^ task is: TASK: meta (flush_handlers) 15980 1727204146.21071: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204146.21074: getting variables 15980 1727204146.21075: in VariableManager get_vars() 15980 1727204146.21084: Calling all_inventory to load vars for managed-node2 15980 1727204146.21087: Calling groups_inventory to load vars for managed-node2 15980 1727204146.21089: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204146.21095: Calling all_plugins_play to load vars for managed-node2 15980 1727204146.21098: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204146.21101: Calling groups_plugins_play to load vars for managed-node2 15980 1727204146.21454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204146.21859: done with get_vars() 15980 1727204146.21872: done getting variables 15980 1727204146.21931: in VariableManager get_vars() 15980 1727204146.21942: Calling all_inventory to load vars for managed-node2 15980 1727204146.21944: Calling groups_inventory to load vars for managed-node2 15980 1727204146.21947: Calling 
all_plugins_inventory to load vars for managed-node2 15980 1727204146.21952: Calling all_plugins_play to load vars for managed-node2 15980 1727204146.21955: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204146.21958: Calling groups_plugins_play to load vars for managed-node2 15980 1727204146.22310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204146.22747: done with get_vars() 15980 1727204146.22762: done queuing things up, now waiting for results queue to drain 15980 1727204146.22764: results queue empty 15980 1727204146.22969: checking for any_errors_fatal 15980 1727204146.22971: done checking for any_errors_fatal 15980 1727204146.22972: checking for max_fail_percentage 15980 1727204146.22973: done checking for max_fail_percentage 15980 1727204146.22974: checking to see if all hosts have failed and the running result is not ok 15980 1727204146.22975: done checking to see if all hosts have failed 15980 1727204146.22976: getting the remaining hosts for this loop 15980 1727204146.22977: done getting the remaining hosts for this loop 15980 1727204146.22980: getting the next task for host managed-node2 15980 1727204146.22984: done getting next task for host managed-node2 15980 1727204146.22985: ^ task is: None 15980 1727204146.22987: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204146.22988: done queuing things up, now waiting for results queue to drain 15980 1727204146.22989: results queue empty 15980 1727204146.22990: checking for any_errors_fatal 15980 1727204146.22990: done checking for any_errors_fatal 15980 1727204146.22991: checking for max_fail_percentage 15980 1727204146.22992: done checking for max_fail_percentage 15980 1727204146.22993: checking to see if all hosts have failed and the running result is not ok 15980 1727204146.22994: done checking to see if all hosts have failed 15980 1727204146.22996: getting the next task for host managed-node2 15980 1727204146.22998: done getting next task for host managed-node2 15980 1727204146.22999: ^ task is: None 15980 1727204146.23001: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204146.23286: in VariableManager get_vars() 15980 1727204146.23312: done with get_vars() 15980 1727204146.23318: in VariableManager get_vars() 15980 1727204146.23334: done with get_vars() 15980 1727204146.23339: variable 'omit' from source: magic vars 15980 1727204146.23376: in VariableManager get_vars() 15980 1727204146.23391: done with get_vars() 15980 1727204146.23415: variable 'omit' from source: magic vars PLAY [Add test bridge] ********************************************************* 15980 1727204146.24558: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15980 1727204146.24884: getting the remaining hosts for this loop 15980 1727204146.24886: done getting the remaining hosts for this loop 15980 1727204146.24889: getting the next task for host managed-node2 15980 1727204146.24893: done getting next task for host managed-node2 15980 1727204146.24896: ^ task is: TASK: Gathering Facts 15980 1727204146.24897: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204146.24900: getting variables 15980 1727204146.24901: in VariableManager get_vars() 15980 1727204146.24923: Calling all_inventory to load vars for managed-node2 15980 1727204146.24926: Calling groups_inventory to load vars for managed-node2 15980 1727204146.24928: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204146.24935: Calling all_plugins_play to load vars for managed-node2 15980 1727204146.24938: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204146.24941: Calling groups_plugins_play to load vars for managed-node2 15980 1727204146.25156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204146.25375: done with get_vars() 15980 1727204146.25386: done getting variables 15980 1727204146.25441: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17 Tuesday 24 September 2024 14:55:46 -0400 (0:00:00.175) 0:00:07.666 ***** 15980 1727204146.25588: entering _queue_task() for managed-node2/gather_facts 15980 1727204146.26674: worker is 1 (out of 1 available) 15980 1727204146.26686: exiting _queue_task() for managed-node2/gather_facts 15980 1727204146.26698: done queuing things up, now waiting for results queue to drain 15980 1727204146.26700: waiting for pending results... 
15980 1727204146.27635: running TaskExecutor() for managed-node2/TASK: Gathering Facts 15980 1727204146.27784: in run() - task 127b8e07-fff9-5f1d-4b72-00000000014c 15980 1727204146.27855: variable 'ansible_search_path' from source: unknown 15980 1727204146.28053: calling self._execute() 15980 1727204146.28194: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204146.28273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204146.28288: variable 'omit' from source: magic vars 15980 1727204146.29677: variable 'ansible_distribution_major_version' from source: facts 15980 1727204146.29682: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204146.29686: variable 'omit' from source: magic vars 15980 1727204146.29689: variable 'omit' from source: magic vars 15980 1727204146.29713: variable 'omit' from source: magic vars 15980 1727204146.29762: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204146.29815: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204146.29917: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204146.29942: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204146.30017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204146.30056: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204146.30115: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204146.30124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204146.30354: Set connection var ansible_connection to ssh 15980 1727204146.30371: Set 
connection var ansible_pipelining to False 15980 1727204146.30383: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204146.30445: Set connection var ansible_timeout to 10 15980 1727204146.30456: Set connection var ansible_shell_type to sh 15980 1727204146.30470: Set connection var ansible_shell_executable to /bin/sh 15980 1727204146.30506: variable 'ansible_shell_executable' from source: unknown 15980 1727204146.30549: variable 'ansible_connection' from source: unknown 15980 1727204146.30761: variable 'ansible_module_compression' from source: unknown 15980 1727204146.30767: variable 'ansible_shell_type' from source: unknown 15980 1727204146.30770: variable 'ansible_shell_executable' from source: unknown 15980 1727204146.30773: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204146.30775: variable 'ansible_pipelining' from source: unknown 15980 1727204146.30778: variable 'ansible_timeout' from source: unknown 15980 1727204146.30780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204146.31079: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204146.31273: variable 'omit' from source: magic vars 15980 1727204146.31277: starting attempt loop 15980 1727204146.31280: running the handler 15980 1727204146.31282: variable 'ansible_facts' from source: unknown 15980 1727204146.31284: _low_level_execute_command(): starting 15980 1727204146.31286: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204146.33203: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204146.33367: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204146.33371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204146.33374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204146.33555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204146.35233: stdout chunk (state=3): >>>/root <<< 15980 1727204146.35400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204146.35470: stderr chunk (state=3): >>><<< 15980 1727204146.35474: stdout chunk (state=3): >>><<< 15980 1727204146.35675: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204146.35680: _low_level_execute_command(): starting 15980 1727204146.35683: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204146.3551373-16946-46673068077612 `" && echo ansible-tmp-1727204146.3551373-16946-46673068077612="` echo /root/.ansible/tmp/ansible-tmp-1727204146.3551373-16946-46673068077612 `" ) && sleep 0' 15980 1727204146.37043: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204146.37187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204146.37251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204146.37408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204146.37431: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204146.37532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204146.39707: stdout chunk (state=3): >>>ansible-tmp-1727204146.3551373-16946-46673068077612=/root/.ansible/tmp/ansible-tmp-1727204146.3551373-16946-46673068077612 <<< 15980 1727204146.39786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204146.39813: stderr chunk (state=3): >>><<< 15980 1727204146.39870: stdout chunk (state=3): >>><<< 15980 1727204146.39891: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204146.3551373-16946-46673068077612=/root/.ansible/tmp/ansible-tmp-1727204146.3551373-16946-46673068077612 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204146.40069: variable 'ansible_module_compression' from source: unknown 15980 1727204146.40149: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15980 1727204146.40222: variable 'ansible_facts' from source: unknown 15980 1727204146.40690: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204146.3551373-16946-46673068077612/AnsiballZ_setup.py 15980 1727204146.41024: Sending initial data 15980 1727204146.41117: Sent initial data (153 bytes) 15980 1727204146.42672: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204146.42888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204146.42963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204146.43133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204146.44677: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204146.44782: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204146.44817: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpm368gbog /root/.ansible/tmp/ansible-tmp-1727204146.3551373-16946-46673068077612/AnsiballZ_setup.py <<< 15980 1727204146.44821: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204146.3551373-16946-46673068077612/AnsiballZ_setup.py" <<< 15980 1727204146.44998: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpm368gbog" to remote "/root/.ansible/tmp/ansible-tmp-1727204146.3551373-16946-46673068077612/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204146.3551373-16946-46673068077612/AnsiballZ_setup.py" <<< 15980 1727204146.48471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204146.48677: stderr chunk (state=3): >>><<< 15980 1727204146.48682: stdout chunk (state=3): >>><<< 15980 1727204146.48684: done transferring module to remote 15980 1727204146.48687: _low_level_execute_command(): starting 15980 1727204146.48689: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204146.3551373-16946-46673068077612/ /root/.ansible/tmp/ansible-tmp-1727204146.3551373-16946-46673068077612/AnsiballZ_setup.py && sleep 0' 15980 1727204146.50508: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204146.50513: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204146.50589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 1727204146.50710: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 15980 1727204146.50724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204146.50791: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204146.50841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204146.50872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204146.51023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204146.52879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204146.53438: stderr chunk (state=3): >>><<< 15980 1727204146.53443: stdout chunk (state=3): >>><<< 15980 1727204146.53445: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204146.53448: _low_level_execute_command(): starting 15980 1727204146.53451: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204146.3551373-16946-46673068077612/AnsiballZ_setup.py && sleep 0' 15980 1727204146.54852: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204146.55086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204146.55090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204146.55126: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204146.55314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' <<< 15980 1727204146.55345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204146.55421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204146.55683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204147.19105: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", 
"ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, 
"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_loadavg": {"1m": 0.80859375, "5m": 0.52001953125, "15m": 0.25146484375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3037, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 679, "free": 3037}, "nocache": {"free": 3466, "used": 250}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], 
"uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 493, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325779968, "block_size": 4096, "block_total": 64479564, "block_available": 61358833, "block_used": 3120731, "inode_total": 16384000, "inode_available": 16301510, "inode_used": 82490, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_is_chroot": false, "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", 
"netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "47", "epoch": "1727204147", "epoch_int": "1727204147", "date": "2024-09-24", "time": "14:55:47", "iso8601_micro": "2024-09-24T18:55:47.185905Z", "iso8601": "2024-09-24T18:55:47Z", "iso8601_basic": "20240924T145547185905", "iso8601_basic_short": "20240924T145547", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_local": {}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], 
"module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15980 1727204147.21182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204147.21186: stdout chunk (state=3): >>><<< 15980 1727204147.21188: stderr chunk (state=3): >>><<< 15980 1727204147.21388: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", 
"ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 
6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_loadavg": {"1m": 0.80859375, "5m": 0.52001953125, "15m": 0.25146484375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3037, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 679, "free": 3037}, "nocache": {"free": 3466, "used": 250}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", 
"ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 493, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325779968, "block_size": 4096, "block_total": 64479564, "block_available": 61358833, "block_used": 3120731, "inode_total": 16384000, "inode_available": 16301510, "inode_used": 82490, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_is_chroot": false, "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, 
"type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "47", "epoch": "1727204147", "epoch_int": "1727204147", "date": "2024-09-24", "time": "14:55:47", "iso8601_micro": "2024-09-24T18:55:47.185905Z", "iso8601": "2024-09-24T18:55:47Z", "iso8601_basic": "20240924T145547185905", "iso8601_basic_short": "20240924T145547", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_local": {}, "ansible_lsb": {}, 
"ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
15980 1727204147.22549: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204146.3551373-16946-46673068077612/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204147.22672: _low_level_execute_command(): starting 15980 1727204147.22676: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204146.3551373-16946-46673068077612/ > /dev/null 2>&1 && sleep 0' 15980 1727204147.24954: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204147.25192: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204147.25477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204147.25495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204147.25590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204147.27620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204147.27625: stdout chunk (state=3): >>><<< 15980 1727204147.27628: stderr chunk (state=3): >>><<< 15980 1727204147.27647: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204147.28075: handler run complete 15980 1727204147.28129: variable 'ansible_facts' from source: unknown 15980 
1727204147.28472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204147.29481: variable 'ansible_facts' from source: unknown 15980 1727204147.29887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204147.30157: attempt loop complete, returning result 15980 1727204147.30192: _execute() done 15980 1727204147.30258: dumping result to json 15980 1727204147.30297: done dumping result, returning 15980 1727204147.30310: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-5f1d-4b72-00000000014c] 15980 1727204147.30319: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000014c ok: [managed-node2] 15980 1727204147.31547: no more pending results, returning what we have 15980 1727204147.31550: results queue empty 15980 1727204147.31551: checking for any_errors_fatal 15980 1727204147.31552: done checking for any_errors_fatal 15980 1727204147.31554: checking for max_fail_percentage 15980 1727204147.31556: done checking for max_fail_percentage 15980 1727204147.31557: checking to see if all hosts have failed and the running result is not ok 15980 1727204147.31557: done checking to see if all hosts have failed 15980 1727204147.31558: getting the remaining hosts for this loop 15980 1727204147.31560: done getting the remaining hosts for this loop 15980 1727204147.31564: getting the next task for host managed-node2 15980 1727204147.31572: done getting next task for host managed-node2 15980 1727204147.31574: ^ task is: TASK: meta (flush_handlers) 15980 1727204147.31576: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204147.31580: getting variables 15980 1727204147.31581: in VariableManager get_vars() 15980 1727204147.31810: Calling all_inventory to load vars for managed-node2 15980 1727204147.31814: Calling groups_inventory to load vars for managed-node2 15980 1727204147.31816: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204147.31830: Calling all_plugins_play to load vars for managed-node2 15980 1727204147.31834: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204147.31838: Calling groups_plugins_play to load vars for managed-node2 15980 1727204147.32189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204147.32518: done with get_vars() 15980 1727204147.32533: done getting variables 15980 1727204147.32686: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000014c 15980 1727204147.32689: WORKER PROCESS EXITING 15980 1727204147.32742: in VariableManager get_vars() 15980 1727204147.32758: Calling all_inventory to load vars for managed-node2 15980 1727204147.32761: Calling groups_inventory to load vars for managed-node2 15980 1727204147.32763: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204147.32873: Calling all_plugins_play to load vars for managed-node2 15980 1727204147.32877: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204147.32881: Calling groups_plugins_play to load vars for managed-node2 15980 1727204147.33269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204147.33704: done with get_vars() 15980 1727204147.33722: done queuing things up, now waiting for results queue to drain 15980 1727204147.33724: results queue empty 15980 1727204147.33728: checking for any_errors_fatal 15980 1727204147.33733: done checking for any_errors_fatal 15980 1727204147.33734: checking for max_fail_percentage 15980 
1727204147.33735: done checking for max_fail_percentage 15980 1727204147.33736: checking to see if all hosts have failed and the running result is not ok 15980 1727204147.33736: done checking to see if all hosts have failed 15980 1727204147.33742: getting the remaining hosts for this loop 15980 1727204147.33743: done getting the remaining hosts for this loop 15980 1727204147.33746: getting the next task for host managed-node2 15980 1727204147.33750: done getting next task for host managed-node2 15980 1727204147.33753: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15980 1727204147.33755: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204147.33883: getting variables 15980 1727204147.33885: in VariableManager get_vars() 15980 1727204147.33901: Calling all_inventory to load vars for managed-node2 15980 1727204147.33903: Calling groups_inventory to load vars for managed-node2 15980 1727204147.33905: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204147.33911: Calling all_plugins_play to load vars for managed-node2 15980 1727204147.33913: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204147.33915: Calling groups_plugins_play to load vars for managed-node2 15980 1727204147.34322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204147.34744: done with get_vars() 15980 1727204147.34755: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:55:47 -0400 (0:00:01.093) 0:00:08.759 
***** 15980 1727204147.34913: entering _queue_task() for managed-node2/include_tasks 15980 1727204147.35825: worker is 1 (out of 1 available) 15980 1727204147.35843: exiting _queue_task() for managed-node2/include_tasks 15980 1727204147.35971: done queuing things up, now waiting for results queue to drain 15980 1727204147.35975: waiting for pending results... 15980 1727204147.36357: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15980 1727204147.36675: in run() - task 127b8e07-fff9-5f1d-4b72-000000000014 15980 1727204147.36702: variable 'ansible_search_path' from source: unknown 15980 1727204147.36710: variable 'ansible_search_path' from source: unknown 15980 1727204147.36768: calling self._execute() 15980 1727204147.37057: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204147.37070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204147.37087: variable 'omit' from source: magic vars 15980 1727204147.37891: variable 'ansible_distribution_major_version' from source: facts 15980 1727204147.38147: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204147.38151: _execute() done 15980 1727204147.38154: dumping result to json 15980 1727204147.38157: done dumping result, returning 15980 1727204147.38160: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-5f1d-4b72-000000000014] 15980 1727204147.38162: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000014 15980 1727204147.38407: no more pending results, returning what we have 15980 1727204147.38414: in VariableManager get_vars() 15980 1727204147.38467: Calling all_inventory to load vars for managed-node2 15980 1727204147.38471: Calling groups_inventory to load vars for managed-node2 15980 1727204147.38473: Calling all_plugins_inventory to load vars for 
managed-node2 15980 1727204147.38490: Calling all_plugins_play to load vars for managed-node2 15980 1727204147.38494: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204147.38499: Calling groups_plugins_play to load vars for managed-node2 15980 1727204147.39101: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000014 15980 1727204147.39106: WORKER PROCESS EXITING 15980 1727204147.39151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204147.39650: done with get_vars() 15980 1727204147.39660: variable 'ansible_search_path' from source: unknown 15980 1727204147.39661: variable 'ansible_search_path' from source: unknown 15980 1727204147.39809: we have included files to process 15980 1727204147.39810: generating all_blocks data 15980 1727204147.39812: done generating all_blocks data 15980 1727204147.39813: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15980 1727204147.39814: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15980 1727204147.39817: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15980 1727204147.41624: done processing included file 15980 1727204147.41627: iterating over new_blocks loaded from include file 15980 1727204147.41629: in VariableManager get_vars() 15980 1727204147.41652: done with get_vars() 15980 1727204147.41654: filtering new block on tags 15980 1727204147.41673: done filtering new block on tags 15980 1727204147.41675: in VariableManager get_vars() 15980 1727204147.41799: done with get_vars() 15980 1727204147.41801: filtering new block on tags 15980 1727204147.41822: done filtering new block on tags 15980 1727204147.41824: in VariableManager get_vars() 15980 1727204147.41843: done with get_vars() 15980 
1727204147.41845: filtering new block on tags 15980 1727204147.41861: done filtering new block on tags 15980 1727204147.41863: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 15980 1727204147.41973: extending task lists for all hosts with included blocks 15980 1727204147.42926: done extending task lists 15980 1727204147.42928: done processing included files 15980 1727204147.42929: results queue empty 15980 1727204147.42930: checking for any_errors_fatal 15980 1727204147.42931: done checking for any_errors_fatal 15980 1727204147.42932: checking for max_fail_percentage 15980 1727204147.42933: done checking for max_fail_percentage 15980 1727204147.42934: checking to see if all hosts have failed and the running result is not ok 15980 1727204147.42935: done checking to see if all hosts have failed 15980 1727204147.42936: getting the remaining hosts for this loop 15980 1727204147.42937: done getting the remaining hosts for this loop 15980 1727204147.42940: getting the next task for host managed-node2 15980 1727204147.42944: done getting next task for host managed-node2 15980 1727204147.42947: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15980 1727204147.42950: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204147.42961: getting variables 15980 1727204147.42962: in VariableManager get_vars() 15980 1727204147.42983: Calling all_inventory to load vars for managed-node2 15980 1727204147.42985: Calling groups_inventory to load vars for managed-node2 15980 1727204147.42988: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204147.43071: Calling all_plugins_play to load vars for managed-node2 15980 1727204147.43076: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204147.43080: Calling groups_plugins_play to load vars for managed-node2 15980 1727204147.43505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204147.43949: done with get_vars() 15980 1727204147.43963: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:55:47 -0400 (0:00:00.093) 0:00:08.852 ***** 15980 1727204147.44223: entering _queue_task() for managed-node2/setup 15980 1727204147.45020: worker is 1 (out of 1 available) 15980 1727204147.45150: exiting _queue_task() for managed-node2/setup 15980 1727204147.45163: done queuing things up, now waiting for results queue to drain 15980 1727204147.45167: waiting for pending results... 
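(Aside, not part of the log: each debug record above is prefixed with the controller PID and a Unix timestamp, e.g. `15980 1727204147.44223:`, and the per-task banner durations such as `0:00:00.093` correspond to the difference between such timestamps. A small sketch, assuming that two-field prefix format, of recovering a delta between two records:)

```python
import re

# Matches the "<pid> <unix_time>:" prefix used by the verbose log above.
PREFIX = re.compile(r"^(\d+)\s+(\d+\.\d+):")

def parse_ts(line: str) -> float:
    """Return the Unix timestamp from a debug log line, or raise ValueError."""
    m = PREFIX.match(line)
    if not m:
        raise ValueError(f"no pid/timestamp prefix: {line!r}")
    return float(m.group(2))

# Two records bracketing the include_tasks evaluation in the log above.
start = parse_ts("15980 1727204147.34913: entering _queue_task() for managed-node2/include_tasks")
end = parse_ts("15980 1727204147.44223: entering _queue_task() for managed-node2/setup")
print(f"{end - start:.3f}s")  # 0.093s, matching the task banner's elapsed time
```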
15980 1727204147.45543: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15980 1727204147.45843: in run() - task 127b8e07-fff9-5f1d-4b72-00000000018d 15980 1727204147.45890: variable 'ansible_search_path' from source: unknown 15980 1727204147.46121: variable 'ansible_search_path' from source: unknown 15980 1727204147.46125: calling self._execute() 15980 1727204147.46257: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204147.46274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204147.46291: variable 'omit' from source: magic vars 15980 1727204147.47130: variable 'ansible_distribution_major_version' from source: facts 15980 1727204147.47153: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204147.47732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204147.53343: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204147.53428: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204147.53599: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204147.53873: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204147.53877: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204147.54012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204147.54051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204147.54121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204147.54220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204147.54415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204147.54419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204147.54421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204147.54550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204147.54601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204147.54648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204147.55087: variable '__network_required_facts' from source: role 
'' defaults 15980 1727204147.55103: variable 'ansible_facts' from source: unknown 15980 1727204147.55319: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15980 1727204147.55328: when evaluation is False, skipping this task 15980 1727204147.55335: _execute() done 15980 1727204147.55340: dumping result to json 15980 1727204147.55390: done dumping result, returning 15980 1727204147.55404: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-5f1d-4b72-00000000018d] 15980 1727204147.55413: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000018d 15980 1727204147.55802: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000018d 15980 1727204147.55805: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15980 1727204147.55856: no more pending results, returning what we have 15980 1727204147.55861: results queue empty 15980 1727204147.55862: checking for any_errors_fatal 15980 1727204147.55863: done checking for any_errors_fatal 15980 1727204147.55864: checking for max_fail_percentage 15980 1727204147.55867: done checking for max_fail_percentage 15980 1727204147.55868: checking to see if all hosts have failed and the running result is not ok 15980 1727204147.55869: done checking to see if all hosts have failed 15980 1727204147.55870: getting the remaining hosts for this loop 15980 1727204147.55872: done getting the remaining hosts for this loop 15980 1727204147.55876: getting the next task for host managed-node2 15980 1727204147.55886: done getting next task for host managed-node2 15980 1727204147.55890: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15980 1727204147.55893: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204147.55909: getting variables 15980 1727204147.55911: in VariableManager get_vars() 15980 1727204147.55951: Calling all_inventory to load vars for managed-node2 15980 1727204147.55954: Calling groups_inventory to load vars for managed-node2 15980 1727204147.55956: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204147.56171: Calling all_plugins_play to load vars for managed-node2 15980 1727204147.56176: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204147.56180: Calling groups_plugins_play to load vars for managed-node2 15980 1727204147.56362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204147.57309: done with get_vars() 15980 1727204147.57324: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:55:47 -0400 (0:00:00.132) 0:00:08.984 ***** 15980 1727204147.57427: entering _queue_task() for managed-node2/stat 15980 1727204147.58079: worker is 1 (out of 1 available) 15980 1727204147.58094: exiting _queue_task() for managed-node2/stat 15980 1727204147.58107: done queuing things up, now waiting for results queue to drain 15980 1727204147.58109: waiting for pending results... 
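[Editor's note] The skip recorded above comes from the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluating to False. The following is a minimal standalone sketch of that logic, assuming Jinja2's `difference` filter behaves like an order-preserving set difference; the fact names and values are illustrative assumptions, not taken from this log.

```python
# Hedged sketch of the conditional evaluated at set_facts.yml:3:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# The task only runs when at least one required fact is missing.

def facts_missing(required_facts, ansible_facts):
    """Return True when any required fact is absent (task should run)."""
    # Approximates the `difference` filter: items in the first list
    # that do not appear in the second.
    missing = [f for f in required_facts if f not in ansible_facts]
    return len(missing) > 0

# Illustrative values (assumptions, not from the log above):
required = ["distribution", "os_family"]
gathered = {"distribution": "Fedora", "os_family": "RedHat"}
print(facts_missing(required, gathered))  # all present -> False, task is skipped
print(facts_missing(required, {}))        # facts missing -> True, task would run
```

When the conditional is False, as here, the worker skips the setup action entirely and returns a skipped result, which is the `"censored"`/`skipping` payload shown in the log.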
15980 1727204147.58801: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 15980 1727204147.58934: in run() - task 127b8e07-fff9-5f1d-4b72-00000000018f 15980 1727204147.59020: variable 'ansible_search_path' from source: unknown 15980 1727204147.59029: variable 'ansible_search_path' from source: unknown 15980 1727204147.59109: calling self._execute() 15980 1727204147.59334: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204147.59350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204147.59368: variable 'omit' from source: magic vars 15980 1727204147.59788: variable 'ansible_distribution_major_version' from source: facts 15980 1727204147.59808: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204147.60013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204147.60328: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204147.60386: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204147.60530: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204147.60533: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204147.60576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204147.60606: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204147.60647: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204147.60682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204147.60788: variable '__network_is_ostree' from source: set_fact 15980 1727204147.60801: Evaluated conditional (not __network_is_ostree is defined): False 15980 1727204147.60810: when evaluation is False, skipping this task 15980 1727204147.60818: _execute() done 15980 1727204147.60825: dumping result to json 15980 1727204147.60833: done dumping result, returning 15980 1727204147.60851: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-5f1d-4b72-00000000018f] 15980 1727204147.60867: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000018f skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15980 1727204147.61149: no more pending results, returning what we have 15980 1727204147.61153: results queue empty 15980 1727204147.61154: checking for any_errors_fatal 15980 1727204147.61163: done checking for any_errors_fatal 15980 1727204147.61164: checking for max_fail_percentage 15980 1727204147.61167: done checking for max_fail_percentage 15980 1727204147.61169: checking to see if all hosts have failed and the running result is not ok 15980 1727204147.61170: done checking to see if all hosts have failed 15980 1727204147.61171: getting the remaining hosts for this loop 15980 1727204147.61173: done getting the remaining hosts for this loop 15980 1727204147.61179: getting the next task for host managed-node2 15980 1727204147.61187: done getting next task for host managed-node2 15980 
1727204147.61191: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15980 1727204147.61195: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204147.61212: getting variables 15980 1727204147.61215: in VariableManager get_vars() 15980 1727204147.61258: Calling all_inventory to load vars for managed-node2 15980 1727204147.61261: Calling groups_inventory to load vars for managed-node2 15980 1727204147.61264: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204147.61392: Calling all_plugins_play to load vars for managed-node2 15980 1727204147.61396: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204147.61399: Calling groups_plugins_play to load vars for managed-node2 15980 1727204147.61692: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000018f 15980 1727204147.61695: WORKER PROCESS EXITING 15980 1727204147.61728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204147.61953: done with get_vars() 15980 1727204147.61969: done getting variables 15980 1727204147.62040: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 
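[Editor's note] Both the "Check if system is ostree" task above and the "Set flag to indicate system is ostree" task that follows are guarded by `when: not __network_is_ostree is defined`. Because an earlier `set_fact` already stored `__network_is_ostree` (the log notes its source as `set_fact`), the guard is False and both tasks skip. A minimal sketch of that guard, using a plain dict as a stand-in for a host's fact store (an assumption for illustration):

```python
# Hedged sketch of the guard shown twice in this log:
#   when: not __network_is_ostree is defined
# Jinja2's `is defined` test checks only for the variable's existence,
# not its value, so even __network_is_ostree == False disables the task.

def should_run(host_facts):
    """True only while __network_is_ostree has not been set yet."""
    return "__network_is_ostree" not in host_facts

facts = {}
print(should_run(facts))               # fact not yet set -> True, task runs
facts["__network_is_ostree"] = False   # value is irrelevant to `is defined`
print(should_run(facts))               # fact defined -> False, task skips
```

This is why the log reports `"false_condition": "not __network_is_ostree is defined"` with `skip_reason: Conditional result was False` for both tasks.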
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:55:47 -0400 (0:00:00.046) 0:00:09.031 ***** 15980 1727204147.62083: entering _queue_task() for managed-node2/set_fact 15980 1727204147.62443: worker is 1 (out of 1 available) 15980 1727204147.62459: exiting _queue_task() for managed-node2/set_fact 15980 1727204147.62581: done queuing things up, now waiting for results queue to drain 15980 1727204147.62584: waiting for pending results... 15980 1727204147.62782: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15980 1727204147.62943: in run() - task 127b8e07-fff9-5f1d-4b72-000000000190 15980 1727204147.62969: variable 'ansible_search_path' from source: unknown 15980 1727204147.63272: variable 'ansible_search_path' from source: unknown 15980 1727204147.63279: calling self._execute() 15980 1727204147.63618: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204147.63622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204147.63625: variable 'omit' from source: magic vars 15980 1727204147.64375: variable 'ansible_distribution_major_version' from source: facts 15980 1727204147.64381: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204147.64681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204147.65540: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204147.65771: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204147.65775: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 
1727204147.65778: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204147.66002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204147.66175: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204147.66194: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204147.66229: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204147.66483: variable '__network_is_ostree' from source: set_fact 15980 1727204147.66502: Evaluated conditional (not __network_is_ostree is defined): False 15980 1727204147.66511: when evaluation is False, skipping this task 15980 1727204147.66518: _execute() done 15980 1727204147.66524: dumping result to json 15980 1727204147.66532: done dumping result, returning 15980 1727204147.66544: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-5f1d-4b72-000000000190] 15980 1727204147.66573: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000190 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15980 1727204147.66947: no more pending results, returning what we have 15980 1727204147.66951: results queue empty 15980 1727204147.66952: checking for any_errors_fatal 15980 1727204147.66957: done checking 
for any_errors_fatal 15980 1727204147.66958: checking for max_fail_percentage 15980 1727204147.66959: done checking for max_fail_percentage 15980 1727204147.66961: checking to see if all hosts have failed and the running result is not ok 15980 1727204147.66962: done checking to see if all hosts have failed 15980 1727204147.66963: getting the remaining hosts for this loop 15980 1727204147.66968: done getting the remaining hosts for this loop 15980 1727204147.66973: getting the next task for host managed-node2 15980 1727204147.66983: done getting next task for host managed-node2 15980 1727204147.66987: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15980 1727204147.66991: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204147.67006: getting variables 15980 1727204147.67008: in VariableManager get_vars() 15980 1727204147.67052: Calling all_inventory to load vars for managed-node2 15980 1727204147.67056: Calling groups_inventory to load vars for managed-node2 15980 1727204147.67059: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204147.67276: Calling all_plugins_play to load vars for managed-node2 15980 1727204147.67281: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204147.67285: Calling groups_plugins_play to load vars for managed-node2 15980 1727204147.67949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204147.68326: done with get_vars() 15980 1727204147.68342: done getting variables 15980 1727204147.68497: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000190 15980 1727204147.68502: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:55:47 -0400 (0:00:00.066) 0:00:09.097 ***** 15980 1727204147.68697: entering _queue_task() for managed-node2/service_facts 15980 1727204147.68700: Creating lock for service_facts 15980 1727204147.69464: worker is 1 (out of 1 available) 15980 1727204147.69483: exiting _queue_task() for managed-node2/service_facts 15980 1727204147.69496: done queuing things up, now waiting for results queue to drain 15980 1727204147.69498: waiting for pending results... 
15980 1727204147.69985: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 15980 1727204147.70220: in run() - task 127b8e07-fff9-5f1d-4b72-000000000192 15980 1727204147.70240: variable 'ansible_search_path' from source: unknown 15980 1727204147.70245: variable 'ansible_search_path' from source: unknown 15980 1727204147.70573: calling self._execute() 15980 1727204147.70602: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204147.70609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204147.70620: variable 'omit' from source: magic vars 15980 1727204147.71564: variable 'ansible_distribution_major_version' from source: facts 15980 1727204147.71580: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204147.71588: variable 'omit' from source: magic vars 15980 1727204147.71861: variable 'omit' from source: magic vars 15980 1727204147.71910: variable 'omit' from source: magic vars 15980 1727204147.72013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204147.72123: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204147.72214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204147.72237: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204147.72255: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204147.72371: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204147.72375: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204147.72378: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 15980 1727204147.72608: Set connection var ansible_connection to ssh 15980 1727204147.72615: Set connection var ansible_pipelining to False 15980 1727204147.72618: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204147.72625: Set connection var ansible_timeout to 10 15980 1727204147.72750: Set connection var ansible_shell_type to sh 15980 1727204147.72757: Set connection var ansible_shell_executable to /bin/sh 15980 1727204147.72925: variable 'ansible_shell_executable' from source: unknown 15980 1727204147.72934: variable 'ansible_connection' from source: unknown 15980 1727204147.72937: variable 'ansible_module_compression' from source: unknown 15980 1727204147.72942: variable 'ansible_shell_type' from source: unknown 15980 1727204147.72944: variable 'ansible_shell_executable' from source: unknown 15980 1727204147.72949: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204147.72954: variable 'ansible_pipelining' from source: unknown 15980 1727204147.72957: variable 'ansible_timeout' from source: unknown 15980 1727204147.72962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204147.73924: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204147.73939: variable 'omit' from source: magic vars 15980 1727204147.73943: starting attempt loop 15980 1727204147.73946: running the handler 15980 1727204147.73963: _low_level_execute_command(): starting 15980 1727204147.73974: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204147.76121: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204147.76154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204147.76158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204147.76330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204147.78108: stdout chunk (state=3): >>>/root <<< 15980 1727204147.78186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204147.78308: stderr chunk (state=3): >>><<< 15980 1727204147.78319: stdout chunk (state=3): >>><<< 15980 1727204147.78476: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204147.78479: _low_level_execute_command(): starting 15980 1727204147.78483: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204147.7842689-17074-143856370473933 `" && echo ansible-tmp-1727204147.7842689-17074-143856370473933="` echo /root/.ansible/tmp/ansible-tmp-1727204147.7842689-17074-143856370473933 `" ) && sleep 0' 15980 1727204147.80131: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204147.80228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204147.80372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204147.80758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204147.82717: stdout chunk (state=3): >>>ansible-tmp-1727204147.7842689-17074-143856370473933=/root/.ansible/tmp/ansible-tmp-1727204147.7842689-17074-143856370473933 <<< 15980 1727204147.82871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204147.83029: stderr chunk (state=3): >>><<< 15980 1727204147.83033: stdout chunk (state=3): >>><<< 15980 1727204147.83182: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204147.7842689-17074-143856370473933=/root/.ansible/tmp/ansible-tmp-1727204147.7842689-17074-143856370473933 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204147.83254: variable 'ansible_module_compression' from source: unknown 15980 1727204147.83324: ANSIBALLZ: Using lock for service_facts 15980 1727204147.83328: ANSIBALLZ: Acquiring lock 15980 1727204147.83401: ANSIBALLZ: Lock acquired: 139981194714752 15980 1727204147.83404: ANSIBALLZ: Creating module 15980 1727204148.02013: ANSIBALLZ: Writing module into payload 15980 1727204148.02132: ANSIBALLZ: Writing module 15980 1727204148.02164: ANSIBALLZ: Renaming module 15980 1727204148.02363: ANSIBALLZ: Done creating module 15980 1727204148.02368: variable 'ansible_facts' from source: unknown 15980 1727204148.02370: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204147.7842689-17074-143856370473933/AnsiballZ_service_facts.py 15980 1727204148.02614: Sending initial data 15980 1727204148.02617: Sent initial data (162 bytes) 15980 1727204148.03536: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204148.03542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204148.03591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204148.03602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204148.03690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204148.05395: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204148.05460: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204148.05534: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp6z8z_23s /root/.ansible/tmp/ansible-tmp-1727204147.7842689-17074-143856370473933/AnsiballZ_service_facts.py <<< 15980 1727204148.05538: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204147.7842689-17074-143856370473933/AnsiballZ_service_facts.py" <<< 15980 1727204148.05605: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp6z8z_23s" to remote "/root/.ansible/tmp/ansible-tmp-1727204147.7842689-17074-143856370473933/AnsiballZ_service_facts.py" <<< 15980 1727204148.05609: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204147.7842689-17074-143856370473933/AnsiballZ_service_facts.py" <<< 15980 1727204148.06297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204148.06377: stderr chunk (state=3): >>><<< 15980 1727204148.06381: stdout chunk (state=3): >>><<< 15980 1727204148.06402: done transferring module to remote 15980 1727204148.06419: _low_level_execute_command(): starting 15980 1727204148.06423: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204147.7842689-17074-143856370473933/ /root/.ansible/tmp/ansible-tmp-1727204147.7842689-17074-143856370473933/AnsiballZ_service_facts.py && sleep 0' 15980 1727204148.06918: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204148.06923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204148.06928: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204148.06930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204148.06988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204148.06992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204148.07071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204148.08993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204148.09025: stderr chunk (state=3): >>><<< 15980 1727204148.09029: stdout chunk (state=3): >>><<< 15980 1727204148.09051: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204148.09091: _low_level_execute_command(): starting 15980 1727204148.09100: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204147.7842689-17074-143856370473933/AnsiballZ_service_facts.py && sleep 0' 15980 1727204148.09686: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204148.09695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15980 1727204148.09699: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204148.09766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting 
O_NONBLOCK <<< 15980 1727204148.09771: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204148.09870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204150.26312: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"<<< 15980 1727204150.26332: stdout chunk (state=3): >>>name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "sourc<<< 15980 1727204150.26347: stdout chunk (state=3): >>>e": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": 
{"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, 
"user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": 
"dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "deb<<< 15980 1727204150.26384: stdout chunk (state=3): >>>ug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plym<<< 15980 1727204150.26413: stdout chunk (state=3): >>>outh-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": 
"rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": 
"systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15980 1727204150.28175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 15980 1727204150.28179: stdout chunk (state=3): >>><<< 15980 1727204150.28181: stderr chunk (state=3): >>><<< 15980 1727204150.28191: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", 
"status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": 
"fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 15980 1727204150.29521: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204147.7842689-17074-143856370473933/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204150.29531: _low_level_execute_command(): starting 15980 1727204150.29537: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204147.7842689-17074-143856370473933/ > /dev/null 2>&1 && sleep 0' 15980 1727204150.30034: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204150.30038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204150.30041: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204150.30052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204150.30107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204150.30111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204150.30136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204150.30213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204150.32326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204150.32331: stdout chunk (state=3): >>><<< 15980 1727204150.32333: stderr chunk (state=3): >>><<< 15980 1727204150.32350: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204150.32472: handler run complete 15980 1727204150.32977: variable 'ansible_facts' from source: unknown 15980 1727204150.33187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204150.33582: variable 'ansible_facts' from source: unknown 15980 1727204150.33687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204150.33843: attempt loop complete, returning result 15980 1727204150.33847: _execute() done 15980 1727204150.33849: dumping result to json 15980 1727204150.33892: done dumping result, returning 15980 1727204150.33901: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-5f1d-4b72-000000000192] 15980 1727204150.33911: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000192 15980 1727204150.35055: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000192 15980 1727204150.35059: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15980 1727204150.35141: no more pending results, returning what we have 15980 1727204150.35145: results queue empty 15980 1727204150.35146: checking for any_errors_fatal 15980 1727204150.35149: done checking for any_errors_fatal 15980 1727204150.35150: checking for max_fail_percentage 15980 1727204150.35151: done checking for max_fail_percentage 15980 1727204150.35152: checking to see if all hosts have 
failed and the running result is not ok 15980 1727204150.35153: done checking to see if all hosts have failed 15980 1727204150.35154: getting the remaining hosts for this loop 15980 1727204150.35155: done getting the remaining hosts for this loop 15980 1727204150.35167: getting the next task for host managed-node2 15980 1727204150.35175: done getting next task for host managed-node2 15980 1727204150.35179: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15980 1727204150.35181: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204150.35200: getting variables 15980 1727204150.35202: in VariableManager get_vars() 15980 1727204150.35262: Calling all_inventory to load vars for managed-node2 15980 1727204150.35268: Calling groups_inventory to load vars for managed-node2 15980 1727204150.35271: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204150.35289: Calling all_plugins_play to load vars for managed-node2 15980 1727204150.35293: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204150.35297: Calling groups_plugins_play to load vars for managed-node2 15980 1727204150.35786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204150.36298: done with get_vars() 15980 1727204150.36316: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:50 -0400 (0:00:02.677) 0:00:11.774 ***** 15980 1727204150.36462: entering _queue_task() for managed-node2/package_facts 15980 1727204150.36464: Creating lock for package_facts 15980 1727204150.36974: worker is 1 (out of 1 available) 15980 1727204150.36993: exiting _queue_task() for managed-node2/package_facts 15980 1727204150.37013: done queuing things up, now waiting for results queue to drain 15980 1727204150.37015: waiting for pending results... 
15980 1727204150.37901: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 15980 1727204150.38493: in run() - task 127b8e07-fff9-5f1d-4b72-000000000193 15980 1727204150.38499: variable 'ansible_search_path' from source: unknown 15980 1727204150.38502: variable 'ansible_search_path' from source: unknown 15980 1727204150.38549: calling self._execute() 15980 1727204150.38682: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204150.38710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204150.38772: variable 'omit' from source: magic vars 15980 1727204150.39304: variable 'ansible_distribution_major_version' from source: facts 15980 1727204150.39325: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204150.39346: variable 'omit' from source: magic vars 15980 1727204150.39423: variable 'omit' from source: magic vars 15980 1727204150.39485: variable 'omit' from source: magic vars 15980 1727204150.39534: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204150.39667: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204150.39672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204150.39674: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204150.39677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204150.39716: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204150.39725: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204150.39734: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 15980 1727204150.39870: Set connection var ansible_connection to ssh 15980 1727204150.39884: Set connection var ansible_pipelining to False 15980 1727204150.40010: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204150.40016: Set connection var ansible_timeout to 10 15980 1727204150.40019: Set connection var ansible_shell_type to sh 15980 1727204150.40021: Set connection var ansible_shell_executable to /bin/sh 15980 1727204150.40024: variable 'ansible_shell_executable' from source: unknown 15980 1727204150.40026: variable 'ansible_connection' from source: unknown 15980 1727204150.40029: variable 'ansible_module_compression' from source: unknown 15980 1727204150.40032: variable 'ansible_shell_type' from source: unknown 15980 1727204150.40034: variable 'ansible_shell_executable' from source: unknown 15980 1727204150.40036: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204150.40039: variable 'ansible_pipelining' from source: unknown 15980 1727204150.40041: variable 'ansible_timeout' from source: unknown 15980 1727204150.40043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204150.40337: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204150.40343: variable 'omit' from source: magic vars 15980 1727204150.40346: starting attempt loop 15980 1727204150.40374: running the handler 15980 1727204150.40445: _low_level_execute_command(): starting 15980 1727204150.40449: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204150.41406: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204150.41580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204150.41657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204150.43413: stdout chunk (state=3): >>>/root <<< 15980 1727204150.43519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204150.43586: stderr chunk (state=3): >>><<< 15980 1727204150.43590: stdout chunk (state=3): >>><<< 15980 1727204150.43616: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204150.43638: _low_level_execute_command(): starting 15980 1727204150.43677: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204150.4361546-17155-113283739950465 `" && echo ansible-tmp-1727204150.4361546-17155-113283739950465="` echo /root/.ansible/tmp/ansible-tmp-1727204150.4361546-17155-113283739950465 `" ) && sleep 0' 15980 1727204150.44330: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204150.44335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204150.44346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204150.44416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204150.44492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204150.46460: stdout chunk (state=3): >>>ansible-tmp-1727204150.4361546-17155-113283739950465=/root/.ansible/tmp/ansible-tmp-1727204150.4361546-17155-113283739950465 <<< 15980 1727204150.46585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204150.46677: stderr chunk (state=3): >>><<< 15980 1727204150.46683: stdout chunk (state=3): >>><<< 15980 1727204150.46759: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204150.4361546-17155-113283739950465=/root/.ansible/tmp/ansible-tmp-1727204150.4361546-17155-113283739950465 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204150.46791: variable 'ansible_module_compression' from source: unknown 15980 1727204150.46852: ANSIBALLZ: Using lock for package_facts 15980 1727204150.46856: ANSIBALLZ: Acquiring lock 15980 1727204150.46858: ANSIBALLZ: Lock acquired: 139981195376544 15980 1727204150.46876: ANSIBALLZ: Creating module 15980 1727204150.81551: ANSIBALLZ: Writing module into payload 15980 1727204150.81555: ANSIBALLZ: Writing module 15980 1727204150.81557: ANSIBALLZ: Renaming module 15980 1727204150.81560: ANSIBALLZ: Done creating module 15980 1727204150.81562: variable 'ansible_facts' from source: unknown 15980 1727204150.81704: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204150.4361546-17155-113283739950465/AnsiballZ_package_facts.py 15980 1727204150.81890: Sending initial data 15980 1727204150.81894: Sent initial data (162 bytes) 15980 1727204150.82635: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204150.82683: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204150.82732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204150.82744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204150.82782: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204150.82857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204150.84582: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204150.84679: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204150.84784: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp7amnxzzp /root/.ansible/tmp/ansible-tmp-1727204150.4361546-17155-113283739950465/AnsiballZ_package_facts.py <<< 15980 1727204150.84796: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204150.4361546-17155-113283739950465/AnsiballZ_package_facts.py" <<< 15980 1727204150.84863: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp7amnxzzp" to remote "/root/.ansible/tmp/ansible-tmp-1727204150.4361546-17155-113283739950465/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204150.4361546-17155-113283739950465/AnsiballZ_package_facts.py" <<< 15980 1727204150.86402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204150.86434: stderr chunk (state=3): >>><<< 15980 1727204150.86438: stdout chunk (state=3): >>><<< 15980 1727204150.86462: done transferring module to remote 15980 1727204150.86478: _low_level_execute_command(): starting 15980 1727204150.86482: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204150.4361546-17155-113283739950465/ /root/.ansible/tmp/ansible-tmp-1727204150.4361546-17155-113283739950465/AnsiballZ_package_facts.py && sleep 0' 15980 1727204150.87061: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15980 1727204150.87121: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204150.87133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204150.87159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204150.87275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204150.89230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204150.89235: stdout chunk (state=3): >>><<< 15980 1727204150.89237: stderr chunk (state=3): >>><<< 15980 1727204150.89349: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204150.89354: _low_level_execute_command(): starting 15980 1727204150.89358: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204150.4361546-17155-113283739950465/AnsiballZ_package_facts.py && sleep 0' 15980 1727204150.89828: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204150.89832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204150.89835: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204150.89837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204150.89839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204150.89895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204150.89903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 
1727204150.89906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204150.89984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204151.52493: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", 
"version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, <<< 15980 1727204151.52507: stdout chunk (state=3): >>>"arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": 
"22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40",<<< 15980 1727204151.52597: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": 
"npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": 
"2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": 
[{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": 
"7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": 
[{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": 
"polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": 
"zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", <<< 15980 1727204151.52654: stdout chunk (state=3): >>>"source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40",
"epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": 
[{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", 
"release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": 
[{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", 
"version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", 
"epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": 
"5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", 
"release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": 
"3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoc<<< 15980 1727204151.52688: stdout chunk (state=3): >>>h": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "s<<< 15980 1727204151.52691: stdout chunk (state=3): >>>ource": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", 
"version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15980 1727204151.54942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204151.54963: stderr chunk (state=3): >>><<< 15980 1727204151.54975: stdout chunk (state=3): >>><<< 15980 1727204151.55098: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", 
"version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", 
"version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": 
"libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": 
"2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": 
"pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", 
"release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": 
"kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": 
"4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", 
"version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", 
"version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": 
"2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", 
"version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": 
"4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": 
[{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
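(Editor's note: the `package_facts` result above reports each installed package as a list of dicts keyed by package name, with `name`, `version`, `release`, `epoch`, `arch`, and `source` fields. As a minimal sketch of how that structure can be consumed, the hypothetical helper below rebuilds the familiar RPM NEVRA string from one such entry; the sample data is copied from the log, the `nevra` function name is ours, and treating an epoch of `None` or `0` as "no epoch" follows common rpm display convention rather than anything shown in this log.)

```python
def nevra(pkg):
    """Format one package_facts entry as an RPM NEVRA string.

    Entries look like the dicts in the log above:
    {"name": ..., "version": ..., "release": ..., "epoch": ..., "arch": ...}
    """
    epoch = pkg.get("epoch")
    # rpm conventionally omits the epoch when it is unset or zero.
    ev = f"{epoch}:{pkg['version']}" if epoch not in (None, 0) else pkg["version"]
    return f"{pkg['name']}-{ev}-{pkg['release']}.{pkg['arch']}"


# Sample entries copied verbatim from the module output above.
facts = {
    "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2",
                       "release": "3.fc40", "epoch": 1, "arch": "x86_64",
                       "source": "rpm"}],
    "perl-base": [{"name": "perl-base", "version": "2.27",
                   "release": "506.fc40", "epoch": 0, "arch": "noarch",
                   "source": "rpm"}],
}

for entries in facts.values():
    for entry in entries:
        print(nevra(entry))
# → openssl-devel-1:3.2.2-3.fc40.x86_64
# → perl-base-2.27-506.fc40.noarch
```

Each key maps to a *list* because multiple versions or architectures of the same package name can coexist on an rpm system.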
15980 1727204151.61474: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204150.4361546-17155-113283739950465/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204151.61479: _low_level_execute_command(): starting 15980 1727204151.61482: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204150.4361546-17155-113283739950465/ > /dev/null 2>&1 && sleep 0' 15980 1727204151.62391: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204151.62513: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204151.62529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204151.62584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204151.62739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204151.64713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204151.64997: stderr chunk (state=3): >>><<< 15980 1727204151.65001: stdout chunk (state=3): >>><<< 15980 1727204151.65019: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204151.65028: handler run complete 15980 1727204151.67458: variable 'ansible_facts' from source: unknown 15980 1727204151.68785: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204151.71958: variable 'ansible_facts' from source: unknown 15980 1727204151.72571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204151.74191: attempt loop complete, returning result 15980 1727204151.74196: _execute() done 15980 1727204151.74198: dumping result to json 15980 1727204151.74574: done dumping result, returning 15980 1727204151.74578: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-5f1d-4b72-000000000193] 15980 1727204151.74581: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000193 15980 1727204151.85121: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000193 15980 1727204151.85128: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15980 1727204151.85189: no more pending results, returning what we have 15980 1727204151.85192: results queue empty 15980 1727204151.85193: checking for any_errors_fatal 15980 1727204151.85198: done checking for any_errors_fatal 15980 1727204151.85199: checking for max_fail_percentage 15980 1727204151.85200: done checking for max_fail_percentage 15980 1727204151.85201: checking to see if all hosts have failed and the running result is not ok 15980 1727204151.85202: done checking to see if all hosts have failed 15980 1727204151.85203: getting the remaining hosts for this loop 15980 1727204151.85204: done getting the remaining hosts for this loop 15980 1727204151.85209: getting the next task for host managed-node2 15980 1727204151.85216: done getting next task for host managed-node2 15980 1727204151.85220: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15980 1727204151.85222: 
^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204151.85302: getting variables 15980 1727204151.85305: in VariableManager get_vars() 15980 1727204151.85420: Calling all_inventory to load vars for managed-node2 15980 1727204151.85424: Calling groups_inventory to load vars for managed-node2 15980 1727204151.85427: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204151.85438: Calling all_plugins_play to load vars for managed-node2 15980 1727204151.85441: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204151.85562: Calling groups_plugins_play to load vars for managed-node2 15980 1727204151.87381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204151.90937: done with get_vars() 15980 1727204151.90978: done getting variables 15980 1727204151.91064: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:51 -0400 (0:00:01.546) 0:00:13.321 ***** 15980 1727204151.91103: entering _queue_task() for managed-node2/debug 15980 1727204151.91518: worker is 1 (out of 1 available) 15980 1727204151.91532: exiting _queue_task() for managed-node2/debug 15980 1727204151.91547: done queuing things up, now waiting for results queue to drain 15980 
1727204151.91550: waiting for pending results... 15980 1727204151.91898: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 15980 1727204151.91993: in run() - task 127b8e07-fff9-5f1d-4b72-000000000015 15980 1727204151.91999: variable 'ansible_search_path' from source: unknown 15980 1727204151.92002: variable 'ansible_search_path' from source: unknown 15980 1727204151.92254: calling self._execute() 15980 1727204151.92562: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204151.92585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204151.92659: variable 'omit' from source: magic vars 15980 1727204151.93708: variable 'ansible_distribution_major_version' from source: facts 15980 1727204151.93898: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204151.93902: variable 'omit' from source: magic vars 15980 1727204151.93904: variable 'omit' from source: magic vars 15980 1727204151.94288: variable 'network_provider' from source: set_fact 15980 1727204151.94301: variable 'omit' from source: magic vars 15980 1727204151.94304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204151.94861: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204151.95094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204151.95099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204151.95102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204151.95495: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204151.95499: variable 'ansible_host' from 
source: host vars for 'managed-node2' 15980 1727204151.95502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204151.95615: Set connection var ansible_connection to ssh 15980 1727204151.95803: Set connection var ansible_pipelining to False 15980 1727204151.95808: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204151.95811: Set connection var ansible_timeout to 10 15980 1727204151.95813: Set connection var ansible_shell_type to sh 15980 1727204151.95816: Set connection var ansible_shell_executable to /bin/sh 15980 1727204151.95985: variable 'ansible_shell_executable' from source: unknown 15980 1727204151.95989: variable 'ansible_connection' from source: unknown 15980 1727204151.95992: variable 'ansible_module_compression' from source: unknown 15980 1727204151.96015: variable 'ansible_shell_type' from source: unknown 15980 1727204151.96019: variable 'ansible_shell_executable' from source: unknown 15980 1727204151.96068: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204151.96073: variable 'ansible_pipelining' from source: unknown 15980 1727204151.96077: variable 'ansible_timeout' from source: unknown 15980 1727204151.96086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204151.96400: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204151.96472: variable 'omit' from source: magic vars 15980 1727204151.96484: starting attempt loop 15980 1727204151.96489: running the handler 15980 1727204151.96508: handler run complete 15980 1727204151.96530: attempt loop complete, returning result 15980 1727204151.96539: _execute() done 15980 1727204151.96547: dumping result to json 15980 
1727204151.96559: done dumping result, returning 15980 1727204151.96579: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-5f1d-4b72-000000000015] 15980 1727204151.96623: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000015 ok: [managed-node2] => {} MSG: Using network provider: nm 15980 1727204151.97075: no more pending results, returning what we have 15980 1727204151.97078: results queue empty 15980 1727204151.97079: checking for any_errors_fatal 15980 1727204151.97085: done checking for any_errors_fatal 15980 1727204151.97086: checking for max_fail_percentage 15980 1727204151.97087: done checking for max_fail_percentage 15980 1727204151.97088: checking to see if all hosts have failed and the running result is not ok 15980 1727204151.97089: done checking to see if all hosts have failed 15980 1727204151.97090: getting the remaining hosts for this loop 15980 1727204151.97091: done getting the remaining hosts for this loop 15980 1727204151.97095: getting the next task for host managed-node2 15980 1727204151.97100: done getting next task for host managed-node2 15980 1727204151.97104: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15980 1727204151.97107: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204151.97116: getting variables 15980 1727204151.97117: in VariableManager get_vars() 15980 1727204151.97153: Calling all_inventory to load vars for managed-node2 15980 1727204151.97157: Calling groups_inventory to load vars for managed-node2 15980 1727204151.97159: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204151.97170: Calling all_plugins_play to load vars for managed-node2 15980 1727204151.97173: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204151.97176: Calling groups_plugins_play to load vars for managed-node2 15980 1727204151.97820: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000015 15980 1727204151.97824: WORKER PROCESS EXITING 15980 1727204152.00076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204152.03906: done with get_vars() 15980 1727204152.03949: done getting variables 15980 1727204152.04055: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:52 -0400 (0:00:00.129) 0:00:13.451 ***** 15980 1727204152.04155: entering _queue_task() for managed-node2/fail 15980 1727204152.04158: Creating lock for fail 15980 1727204152.04539: worker is 1 (out of 1 available) 15980 1727204152.04553: exiting _queue_task() for managed-node2/fail 15980 1727204152.04769: done queuing things up, now waiting for results queue to drain 15980 1727204152.04772: waiting for 
pending results... 15980 1727204152.05151: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15980 1727204152.05331: in run() - task 127b8e07-fff9-5f1d-4b72-000000000016 15980 1727204152.05353: variable 'ansible_search_path' from source: unknown 15980 1727204152.05363: variable 'ansible_search_path' from source: unknown 15980 1727204152.05492: calling self._execute() 15980 1727204152.05707: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204152.05744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204152.05747: variable 'omit' from source: magic vars 15980 1727204152.06257: variable 'ansible_distribution_major_version' from source: facts 15980 1727204152.06371: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204152.06433: variable 'network_state' from source: role '' defaults 15980 1727204152.06451: Evaluated conditional (network_state != {}): False 15980 1727204152.06459: when evaluation is False, skipping this task 15980 1727204152.06468: _execute() done 15980 1727204152.06476: dumping result to json 15980 1727204152.06484: done dumping result, returning 15980 1727204152.06535: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-5f1d-4b72-000000000016] 15980 1727204152.06548: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000016 15980 1727204152.06907: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000016 15980 1727204152.06911: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15980 1727204152.07035: no more 
pending results, returning what we have 15980 1727204152.07040: results queue empty 15980 1727204152.07041: checking for any_errors_fatal 15980 1727204152.07047: done checking for any_errors_fatal 15980 1727204152.07051: checking for max_fail_percentage 15980 1727204152.07053: done checking for max_fail_percentage 15980 1727204152.07054: checking to see if all hosts have failed and the running result is not ok 15980 1727204152.07055: done checking to see if all hosts have failed 15980 1727204152.07056: getting the remaining hosts for this loop 15980 1727204152.07058: done getting the remaining hosts for this loop 15980 1727204152.07106: getting the next task for host managed-node2 15980 1727204152.07112: done getting next task for host managed-node2 15980 1727204152.07116: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15980 1727204152.07119: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204152.07136: getting variables 15980 1727204152.07138: in VariableManager get_vars() 15980 1727204152.07176: Calling all_inventory to load vars for managed-node2 15980 1727204152.07179: Calling groups_inventory to load vars for managed-node2 15980 1727204152.07181: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204152.07190: Calling all_plugins_play to load vars for managed-node2 15980 1727204152.07193: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204152.07195: Calling groups_plugins_play to load vars for managed-node2 15980 1727204152.09459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204152.11071: done with get_vars() 15980 1727204152.11101: done getting variables 15980 1727204152.11154: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:52 -0400 (0:00:00.070) 0:00:13.522 ***** 15980 1727204152.11184: entering _queue_task() for managed-node2/fail 15980 1727204152.11461: worker is 1 (out of 1 available) 15980 1727204152.11475: exiting _queue_task() for managed-node2/fail 15980 1727204152.11490: done queuing things up, now waiting for results queue to drain 15980 1727204152.11492: waiting for pending results... 
15980 1727204152.11676: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15980 1727204152.11754: in run() - task 127b8e07-fff9-5f1d-4b72-000000000017 15980 1727204152.11767: variable 'ansible_search_path' from source: unknown 15980 1727204152.11771: variable 'ansible_search_path' from source: unknown 15980 1727204152.11806: calling self._execute() 15980 1727204152.11884: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204152.11890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204152.11899: variable 'omit' from source: magic vars 15980 1727204152.12202: variable 'ansible_distribution_major_version' from source: facts 15980 1727204152.12213: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204152.12308: variable 'network_state' from source: role '' defaults 15980 1727204152.12316: Evaluated conditional (network_state != {}): False 15980 1727204152.12320: when evaluation is False, skipping this task 15980 1727204152.12323: _execute() done 15980 1727204152.12328: dumping result to json 15980 1727204152.12332: done dumping result, returning 15980 1727204152.12336: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-5f1d-4b72-000000000017] 15980 1727204152.12341: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000017 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15980 1727204152.12515: no more pending results, returning what we have 15980 1727204152.12519: results queue empty 15980 1727204152.12520: checking for any_errors_fatal 15980 1727204152.12532: done checking for any_errors_fatal 
15980 1727204152.12532: checking for max_fail_percentage 15980 1727204152.12534: done checking for max_fail_percentage 15980 1727204152.12535: checking to see if all hosts have failed and the running result is not ok 15980 1727204152.12536: done checking to see if all hosts have failed 15980 1727204152.12537: getting the remaining hosts for this loop 15980 1727204152.12538: done getting the remaining hosts for this loop 15980 1727204152.12543: getting the next task for host managed-node2 15980 1727204152.12549: done getting next task for host managed-node2 15980 1727204152.12553: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15980 1727204152.12555: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204152.12579: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000017 15980 1727204152.12581: WORKER PROCESS EXITING 15980 1727204152.12592: getting variables 15980 1727204152.12593: in VariableManager get_vars() 15980 1727204152.12633: Calling all_inventory to load vars for managed-node2 15980 1727204152.12635: Calling groups_inventory to load vars for managed-node2 15980 1727204152.12637: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204152.12648: Calling all_plugins_play to load vars for managed-node2 15980 1727204152.12650: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204152.12653: Calling groups_plugins_play to load vars for managed-node2 15980 1727204152.14475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204152.16891: done with get_vars() 15980 1727204152.16925: done getting variables 15980 1727204152.16990: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:55:52 -0400 (0:00:00.058) 0:00:13.580 ***** 15980 1727204152.17021: entering _queue_task() for managed-node2/fail 15980 1727204152.17374: worker is 1 (out of 1 available) 15980 1727204152.17390: exiting _queue_task() for managed-node2/fail 15980 1727204152.17405: done queuing things up, now waiting for results queue to drain 15980 1727204152.17407: waiting for pending results... 
15980 1727204152.17670: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15980 1727204152.17861: in run() - task 127b8e07-fff9-5f1d-4b72-000000000018 15980 1727204152.17867: variable 'ansible_search_path' from source: unknown 15980 1727204152.17870: variable 'ansible_search_path' from source: unknown 15980 1727204152.17873: calling self._execute() 15980 1727204152.17971: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204152.17984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204152.17996: variable 'omit' from source: magic vars 15980 1727204152.18392: variable 'ansible_distribution_major_version' from source: facts 15980 1727204152.18438: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204152.18637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204152.21089: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204152.21159: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204152.21200: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204152.21237: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204152.21269: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204152.21360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204152.21447: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204152.21450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204152.21468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204152.21488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204152.21601: variable 'ansible_distribution_major_version' from source: facts 15980 1727204152.21620: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15980 1727204152.21755: variable 'ansible_distribution' from source: facts 15980 1727204152.21759: variable '__network_rh_distros' from source: role '' defaults 15980 1727204152.21772: Evaluated conditional (ansible_distribution in __network_rh_distros): False 15980 1727204152.21776: when evaluation is False, skipping this task 15980 1727204152.21780: _execute() done 15980 1727204152.21782: dumping result to json 15980 1727204152.21785: done dumping result, returning 15980 1727204152.21792: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-5f1d-4b72-000000000018] 15980 1727204152.21800: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000018 15980 1727204152.21957: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000018 15980 1727204152.21961: WORKER PROCESS EXITING 
skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 15980 1727204152.22015: no more pending results, returning what we have 15980 1727204152.22020: results queue empty 15980 1727204152.22021: checking for any_errors_fatal 15980 1727204152.22027: done checking for any_errors_fatal 15980 1727204152.22028: checking for max_fail_percentage 15980 1727204152.22029: done checking for max_fail_percentage 15980 1727204152.22030: checking to see if all hosts have failed and the running result is not ok 15980 1727204152.22031: done checking to see if all hosts have failed 15980 1727204152.22032: getting the remaining hosts for this loop 15980 1727204152.22034: done getting the remaining hosts for this loop 15980 1727204152.22039: getting the next task for host managed-node2 15980 1727204152.22047: done getting next task for host managed-node2 15980 1727204152.22053: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15980 1727204152.22055: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204152.22073: getting variables 15980 1727204152.22075: in VariableManager get_vars() 15980 1727204152.22118: Calling all_inventory to load vars for managed-node2 15980 1727204152.22122: Calling groups_inventory to load vars for managed-node2 15980 1727204152.22124: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204152.22136: Calling all_plugins_play to load vars for managed-node2 15980 1727204152.22140: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204152.22143: Calling groups_plugins_play to load vars for managed-node2 15980 1727204152.24091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204152.26196: done with get_vars() 15980 1727204152.26233: done getting variables 15980 1727204152.26343: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:55:52 -0400 (0:00:00.093) 0:00:13.674 ***** 15980 1727204152.26377: entering _queue_task() for managed-node2/dnf 15980 1727204152.26720: worker is 1 (out of 1 available) 15980 1727204152.26735: exiting _queue_task() for managed-node2/dnf 15980 1727204152.26750: done queuing things up, now waiting for results queue to drain 15980 1727204152.26753: waiting for pending results... 
15980 1727204152.27138: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15980 1727204152.27190: in run() - task 127b8e07-fff9-5f1d-4b72-000000000019 15980 1727204152.27196: variable 'ansible_search_path' from source: unknown 15980 1727204152.27200: variable 'ansible_search_path' from source: unknown 15980 1727204152.27251: calling self._execute() 15980 1727204152.27339: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204152.27344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204152.27353: variable 'omit' from source: magic vars 15980 1727204152.27776: variable 'ansible_distribution_major_version' from source: facts 15980 1727204152.27790: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204152.28023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204152.30604: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204152.30803: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204152.30839: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204152.30985: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204152.31021: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204152.31311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204152.31346: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204152.31532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204152.31580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204152.31596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204152.31724: variable 'ansible_distribution' from source: facts 15980 1727204152.31731: variable 'ansible_distribution_major_version' from source: facts 15980 1727204152.31739: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15980 1727204152.32031: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204152.32341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204152.32345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204152.32348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204152.32350: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204152.32353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204152.32355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204152.32388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204152.32409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204152.32483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204152.32500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204152.32662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204152.32667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 
1727204152.32671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204152.32674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204152.32677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204152.32820: variable 'network_connections' from source: play vars 15980 1727204152.32834: variable 'interface' from source: set_fact 15980 1727204152.32913: variable 'interface' from source: set_fact 15980 1727204152.32922: variable 'interface' from source: set_fact 15980 1727204152.32986: variable 'interface' from source: set_fact 15980 1727204152.33061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204152.33475: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204152.33479: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204152.33481: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204152.33484: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204152.33486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204152.33488: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204152.33499: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204152.33571: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204152.33575: variable '__network_team_connections_defined' from source: role '' defaults 15980 1727204152.33949: variable 'network_connections' from source: play vars 15980 1727204152.33954: variable 'interface' from source: set_fact 15980 1727204152.34023: variable 'interface' from source: set_fact 15980 1727204152.34030: variable 'interface' from source: set_fact 15980 1727204152.34137: variable 'interface' from source: set_fact 15980 1727204152.34164: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15980 1727204152.34170: when evaluation is False, skipping this task 15980 1727204152.34173: _execute() done 15980 1727204152.34176: dumping result to json 15980 1727204152.34178: done dumping result, returning 15980 1727204152.34186: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-5f1d-4b72-000000000019] 15980 1727204152.34191: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000019 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15980 1727204152.34523: no more pending results, returning what we have 15980 1727204152.34529: results queue 
empty 15980 1727204152.34530: checking for any_errors_fatal 15980 1727204152.34537: done checking for any_errors_fatal 15980 1727204152.34539: checking for max_fail_percentage 15980 1727204152.34540: done checking for max_fail_percentage 15980 1727204152.34541: checking to see if all hosts have failed and the running result is not ok 15980 1727204152.34542: done checking to see if all hosts have failed 15980 1727204152.34543: getting the remaining hosts for this loop 15980 1727204152.34545: done getting the remaining hosts for this loop 15980 1727204152.34549: getting the next task for host managed-node2 15980 1727204152.34555: done getting next task for host managed-node2 15980 1727204152.34560: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15980 1727204152.34562: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204152.34578: getting variables 15980 1727204152.34580: in VariableManager get_vars() 15980 1727204152.34824: Calling all_inventory to load vars for managed-node2 15980 1727204152.34831: Calling groups_inventory to load vars for managed-node2 15980 1727204152.34834: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204152.34846: Calling all_plugins_play to load vars for managed-node2 15980 1727204152.34849: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204152.34852: Calling groups_plugins_play to load vars for managed-node2 15980 1727204152.35474: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000019 15980 1727204152.35478: WORKER PROCESS EXITING 15980 1727204152.40745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204152.45984: done with get_vars() 15980 1727204152.46032: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15980 1727204152.46337: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:55:52 -0400 (0:00:00.199) 0:00:13.874 ***** 15980 1727204152.46377: entering _queue_task() for managed-node2/yum 15980 1727204152.46379: Creating lock for yum 15980 1727204152.47174: worker is 1 (out of 1 available) 15980 1727204152.47190: exiting _queue_task() for managed-node2/yum 15980 
1727204152.47205: done queuing things up, now waiting for results queue to drain 15980 1727204152.47207: waiting for pending results... 15980 1727204152.47667: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15980 1727204152.48173: in run() - task 127b8e07-fff9-5f1d-4b72-00000000001a 15980 1727204152.48178: variable 'ansible_search_path' from source: unknown 15980 1727204152.48180: variable 'ansible_search_path' from source: unknown 15980 1727204152.48183: calling self._execute() 15980 1727204152.48186: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204152.48190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204152.48193: variable 'omit' from source: magic vars 15980 1727204152.49255: variable 'ansible_distribution_major_version' from source: facts 15980 1727204152.49281: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204152.49886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204152.56275: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204152.56560: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204152.57027: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204152.57031: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204152.57117: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204152.57377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204152.57418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204152.57491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204152.57616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204152.57674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204152.57988: variable 'ansible_distribution_major_version' from source: facts 15980 1727204152.57992: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15980 1727204152.57995: when evaluation is False, skipping this task 15980 1727204152.57997: _execute() done 15980 1727204152.58003: dumping result to json 15980 1727204152.58006: done dumping result, returning 15980 1727204152.58011: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-5f1d-4b72-00000000001a] 15980 1727204152.58014: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000001a skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15980 1727204152.58163: no more pending results, returning 
what we have 15980 1727204152.58173: results queue empty 15980 1727204152.58174: checking for any_errors_fatal 15980 1727204152.58180: done checking for any_errors_fatal 15980 1727204152.58181: checking for max_fail_percentage 15980 1727204152.58182: done checking for max_fail_percentage 15980 1727204152.58183: checking to see if all hosts have failed and the running result is not ok 15980 1727204152.58184: done checking to see if all hosts have failed 15980 1727204152.58185: getting the remaining hosts for this loop 15980 1727204152.58187: done getting the remaining hosts for this loop 15980 1727204152.58192: getting the next task for host managed-node2 15980 1727204152.58200: done getting next task for host managed-node2 15980 1727204152.58205: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15980 1727204152.58207: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204152.58229: getting variables 15980 1727204152.58231: in VariableManager get_vars() 15980 1727204152.58283: Calling all_inventory to load vars for managed-node2 15980 1727204152.58287: Calling groups_inventory to load vars for managed-node2 15980 1727204152.58290: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204152.58379: Calling all_plugins_play to load vars for managed-node2 15980 1727204152.58383: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204152.58393: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000001a 15980 1727204152.58396: WORKER PROCESS EXITING 15980 1727204152.58400: Calling groups_plugins_play to load vars for managed-node2 15980 1727204152.61689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204152.64057: done with get_vars() 15980 1727204152.64099: done getting variables 15980 1727204152.64191: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:52 -0400 (0:00:00.178) 0:00:14.052 ***** 15980 1727204152.64234: entering _queue_task() for managed-node2/fail 15980 1727204152.64633: worker is 1 (out of 1 available) 15980 1727204152.64762: exiting _queue_task() for managed-node2/fail 15980 1727204152.64778: done queuing things up, now waiting for results queue to drain 15980 1727204152.64780: waiting for pending results... 
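For context on the skip recorded above: the YUM-based update check is version-gated so it only runs on distributions older than EL8, where `yum` rather than `dnf` is the native package manager (on this EL9-era host the action is redirected to `ansible.builtin.dnf` and the conditional evaluates False). A minimal sketch of what a task shaped like this could look like — only the task name, the module (`yum`), the task path (`roles/network/tasks/main.yml:48`), and the `when` expression appear in the log; the module arguments here are assumptions for illustration:

```yaml
- name: Check if updates for network packages are available through the YUM
    package manager due to wireless or team interfaces
  # ansible.builtin.yum is redirected to ansible.builtin.dnf on EL8+ hosts,
  # which is why the log shows "redirecting (type: action) ansible.builtin.yum
  # to ansible.builtin.dnf" before the conditional skips the task.
  ansible.builtin.yum:
    list: updates            # hypothetical argument, not shown in the log
  when: ansible_distribution_major_version | int < 8
```

The `| int` cast matters: `ansible_distribution_major_version` is a string fact, so comparing it numerically without the cast would raise a templating error rather than evaluate to False.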
15980 1727204152.65004: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15980 1727204152.65135: in run() - task 127b8e07-fff9-5f1d-4b72-00000000001b 15980 1727204152.65159: variable 'ansible_search_path' from source: unknown 15980 1727204152.65170: variable 'ansible_search_path' from source: unknown 15980 1727204152.65228: calling self._execute() 15980 1727204152.65348: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204152.65363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204152.65382: variable 'omit' from source: magic vars 15980 1727204152.65871: variable 'ansible_distribution_major_version' from source: facts 15980 1727204152.65909: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204152.66058: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204152.66406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204152.69253: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204152.69346: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204152.69395: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204152.69451: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204152.69485: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204152.69591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15980 1727204152.69632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204152.69672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204152.69721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204152.69748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204152.69872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204152.69876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204152.69879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204152.69928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204152.69947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204152.70005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204152.70037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204152.70070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204152.70124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204152.70201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204152.70367: variable 'network_connections' from source: play vars 15980 1727204152.70386: variable 'interface' from source: set_fact 15980 1727204152.70485: variable 'interface' from source: set_fact 15980 1727204152.70504: variable 'interface' from source: set_fact 15980 1727204152.70595: variable 'interface' from source: set_fact 15980 1727204152.70690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204152.70943: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204152.70997: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204152.71070: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204152.71073: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204152.71133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204152.71163: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204152.71202: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204152.71239: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204152.71317: variable '__network_team_connections_defined' from source: role '' defaults 15980 1727204152.71637: variable 'network_connections' from source: play vars 15980 1727204152.71641: variable 'interface' from source: set_fact 15980 1727204152.71714: variable 'interface' from source: set_fact 15980 1727204152.71745: variable 'interface' from source: set_fact 15980 1727204152.71804: variable 'interface' from source: set_fact 15980 1727204152.71854: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15980 1727204152.71962: when evaluation is False, skipping this task 15980 1727204152.71967: _execute() done 15980 1727204152.71970: dumping result to json 15980 1727204152.71972: done dumping result, returning 15980 1727204152.71974: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-5f1d-4b72-00000000001b] 15980 1727204152.71985: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000001b 15980 1727204152.72060: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000001b 15980 1727204152.72065: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15980 1727204152.72122: no more pending results, returning what we have 15980 1727204152.72128: results queue empty 15980 1727204152.72129: checking for any_errors_fatal 15980 1727204152.72135: done checking for any_errors_fatal 15980 1727204152.72135: checking for max_fail_percentage 15980 1727204152.72137: done checking for max_fail_percentage 15980 1727204152.72138: checking to see if all hosts have failed and the running result is not ok 15980 1727204152.72139: done checking to see if all hosts have failed 15980 1727204152.72140: getting the remaining hosts for this loop 15980 1727204152.72142: done getting the remaining hosts for this loop 15980 1727204152.72146: getting the next task for host managed-node2 15980 1727204152.72153: done getting next task for host managed-node2 15980 1727204152.72157: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15980 1727204152.72159: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204152.72180: getting variables 15980 1727204152.72182: in VariableManager get_vars() 15980 1727204152.72228: Calling all_inventory to load vars for managed-node2 15980 1727204152.72232: Calling groups_inventory to load vars for managed-node2 15980 1727204152.72235: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204152.72248: Calling all_plugins_play to load vars for managed-node2 15980 1727204152.72252: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204152.72256: Calling groups_plugins_play to load vars for managed-node2 15980 1727204152.82581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204152.85795: done with get_vars() 15980 1727204152.86161: done getting variables 15980 1727204152.86267: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:55:52 -0400 (0:00:00.220) 0:00:14.273 ***** 15980 1727204152.86308: entering _queue_task() for managed-node2/package 15980 1727204152.87021: worker is 1 (out of 1 available) 15980 1727204152.87045: exiting _queue_task() for managed-node2/package 15980 1727204152.87381: done queuing things up, now waiting for results queue to drain 15980 1727204152.87385: waiting for pending results... 
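The "Ask user's consent" task skipped just above is a guard implemented with the `fail` action (the log shows `Loading ActionModule 'fail'` for it at `roles/network/tasks/main.yml:60`), conditioned on wireless or team connections being present. A hedged sketch of the shape such a guard could take — the task name and `when` expression are taken from the log, while the message body is an assumption:

```yaml
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    # Hypothetical message; the real wording is not visible in this log.
    msg: Managing wireless or team interfaces requires restarting NetworkManager.
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Because neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` is truthy for this play's `network_connections`, the conditional evaluates False and the `fail` never fires, which is the expected path for plain ethernet-style connections.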
15980 1727204152.88064: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 15980 1727204152.88078: in run() - task 127b8e07-fff9-5f1d-4b72-00000000001c 15980 1727204152.88288: variable 'ansible_search_path' from source: unknown 15980 1727204152.88295: variable 'ansible_search_path' from source: unknown 15980 1727204152.88464: calling self._execute() 15980 1727204152.88710: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204152.88843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204152.88910: variable 'omit' from source: magic vars 15980 1727204152.90157: variable 'ansible_distribution_major_version' from source: facts 15980 1727204152.90162: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204152.90514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204152.91004: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204152.91074: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204152.91237: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204152.91241: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204152.91382: variable 'network_packages' from source: role '' defaults 15980 1727204152.91540: variable '__network_provider_setup' from source: role '' defaults 15980 1727204152.91568: variable '__network_service_name_default_nm' from source: role '' defaults 15980 1727204152.91668: variable '__network_service_name_default_nm' from source: role '' defaults 15980 1727204152.91690: variable '__network_packages_default_nm' from source: role '' defaults 15980 1727204152.91768: variable 
'__network_packages_default_nm' from source: role '' defaults 15980 1727204152.91992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204152.95460: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204152.95750: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204152.95808: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204152.95883: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204152.95989: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204152.96213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204152.96262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204152.96362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204152.96486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204152.96564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 
1727204152.96605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204152.96809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204152.96813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204152.96831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204152.96858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204152.97200: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15980 1727204152.97350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204152.97386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204152.97450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204152.97474: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204152.97493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204152.97601: variable 'ansible_python' from source: facts 15980 1727204152.97682: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15980 1727204152.97970: variable '__network_wpa_supplicant_required' from source: role '' defaults 15980 1727204152.97974: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15980 1727204152.98006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204152.98036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204152.98064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204152.98119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204152.98141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204152.98201: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204152.98344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204152.98418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204152.98559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204152.98653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204152.98877: variable 'network_connections' from source: play vars 15980 1727204152.98891: variable 'interface' from source: set_fact 15980 1727204152.99088: variable 'interface' from source: set_fact 15980 1727204152.99091: variable 'interface' from source: set_fact 15980 1727204152.99210: variable 'interface' from source: set_fact 15980 1727204152.99310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204152.99344: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204152.99385: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204152.99434: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204152.99493: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204152.99875: variable 'network_connections' from source: play vars 15980 1727204152.99887: variable 'interface' from source: set_fact 15980 1727204153.00043: variable 'interface' from source: set_fact 15980 1727204153.00051: variable 'interface' from source: set_fact 15980 1727204153.00302: variable 'interface' from source: set_fact 15980 1727204153.00388: variable '__network_packages_default_wireless' from source: role '' defaults 15980 1727204153.00625: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204153.00985: variable 'network_connections' from source: play vars 15980 1727204153.00997: variable 'interface' from source: set_fact 15980 1727204153.01080: variable 'interface' from source: set_fact 15980 1727204153.01180: variable 'interface' from source: set_fact 15980 1727204153.01228: variable 'interface' from source: set_fact 15980 1727204153.01315: variable '__network_packages_default_team' from source: role '' defaults 15980 1727204153.01530: variable '__network_team_connections_defined' from source: role '' defaults 15980 1727204153.02074: variable 'network_connections' from source: play vars 15980 1727204153.02084: variable 'interface' from source: set_fact 15980 1727204153.02219: variable 'interface' from source: set_fact 15980 1727204153.02262: variable 'interface' from source: set_fact 15980 1727204153.02491: variable 'interface' from source: set_fact 15980 1727204153.02664: variable '__network_service_name_default_initscripts' from source: role '' defaults 15980 
1727204153.02877: variable '__network_service_name_default_initscripts' from source: role '' defaults 15980 1727204153.02895: variable '__network_packages_default_initscripts' from source: role '' defaults 15980 1727204153.03145: variable '__network_packages_default_initscripts' from source: role '' defaults 15980 1727204153.03481: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15980 1727204153.04443: variable 'network_connections' from source: play vars 15980 1727204153.04459: variable 'interface' from source: set_fact 15980 1727204153.04781: variable 'interface' from source: set_fact 15980 1727204153.04832: variable 'interface' from source: set_fact 15980 1727204153.05131: variable 'interface' from source: set_fact 15980 1727204153.05135: variable 'ansible_distribution' from source: facts 15980 1727204153.05137: variable '__network_rh_distros' from source: role '' defaults 15980 1727204153.05139: variable 'ansible_distribution_major_version' from source: facts 15980 1727204153.05147: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15980 1727204153.05597: variable 'ansible_distribution' from source: facts 15980 1727204153.05609: variable '__network_rh_distros' from source: role '' defaults 15980 1727204153.05622: variable 'ansible_distribution_major_version' from source: facts 15980 1727204153.05638: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15980 1727204153.06175: variable 'ansible_distribution' from source: facts 15980 1727204153.06237: variable '__network_rh_distros' from source: role '' defaults 15980 1727204153.06250: variable 'ansible_distribution_major_version' from source: facts 15980 1727204153.06622: variable 'network_provider' from source: set_fact 15980 1727204153.06838: variable 'ansible_facts' from source: unknown 15980 1727204153.08123: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 15980 1727204153.08139: when evaluation is False, skipping this task 15980 1727204153.08148: _execute() done 15980 1727204153.08155: dumping result to json 15980 1727204153.08198: done dumping result, returning 15980 1727204153.08213: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-5f1d-4b72-00000000001c] 15980 1727204153.08224: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000001c skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15980 1727204153.08401: no more pending results, returning what we have 15980 1727204153.08406: results queue empty 15980 1727204153.08409: checking for any_errors_fatal 15980 1727204153.08416: done checking for any_errors_fatal 15980 1727204153.08417: checking for max_fail_percentage 15980 1727204153.08419: done checking for max_fail_percentage 15980 1727204153.08420: checking to see if all hosts have failed and the running result is not ok 15980 1727204153.08421: done checking to see if all hosts have failed 15980 1727204153.08421: getting the remaining hosts for this loop 15980 1727204153.08423: done getting the remaining hosts for this loop 15980 1727204153.08428: getting the next task for host managed-node2 15980 1727204153.08435: done getting next task for host managed-node2 15980 1727204153.08439: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15980 1727204153.08442: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204153.08457: getting variables 15980 1727204153.08459: in VariableManager get_vars() 15980 1727204153.08505: Calling all_inventory to load vars for managed-node2 15980 1727204153.08509: Calling groups_inventory to load vars for managed-node2 15980 1727204153.08511: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204153.08524: Calling all_plugins_play to load vars for managed-node2 15980 1727204153.08533: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204153.08537: Calling groups_plugins_play to load vars for managed-node2 15980 1727204153.09467: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000001c 15980 1727204153.09473: WORKER PROCESS EXITING 15980 1727204153.11622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204153.14152: done with get_vars() 15980 1727204153.14190: done getting variables 15980 1727204153.14260: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:55:53 -0400 (0:00:00.279) 0:00:14.553 ***** 15980 1727204153.14298: entering _queue_task() for managed-node2/package 15980 1727204153.14778: worker is 1 (out of 1 available) 15980 1727204153.14792: exiting _queue_task() for managed-node2/package 15980 1727204153.14809: done queuing things up, now waiting for results queue to drain 15980 1727204153.14811: waiting for pending results... 
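
The "Install packages" task above was skipped because its conditional `not network_packages is subset(ansible_facts.packages.keys())` evaluated to False, meaning every required package was already present on the host. Ansible's `subset` test is plain set containment; a minimal Python sketch of the equivalent check (the package lists here are hypothetical stand-ins for the role's real variables):

```python
# Hypothetical stand-ins for the role's `network_packages` variable and the
# package facts gathered by the package_facts module.
network_packages = ["NetworkManager"]
ansible_facts_packages = {
    "NetworkManager": [{"version": "1.48.0"}],
    "openssh": [{"version": "9.6"}],
}

# Jinja2's `a is subset(b)` corresponds to set containment: the install task
# runs only when some required package is NOT already installed.
needs_install = not set(network_packages) <= set(ansible_facts_packages.keys())

# With every required package present, the condition is False and the task is
# skipped -- matching the "Conditional result was False" record in the log.
```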
15980 1727204153.15440: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15980 1727204153.15775: in run() - task 127b8e07-fff9-5f1d-4b72-00000000001d 15980 1727204153.15780: variable 'ansible_search_path' from source: unknown 15980 1727204153.15782: variable 'ansible_search_path' from source: unknown 15980 1727204153.15786: calling self._execute() 15980 1727204153.15789: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204153.15791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204153.15794: variable 'omit' from source: magic vars 15980 1727204153.16131: variable 'ansible_distribution_major_version' from source: facts 15980 1727204153.16140: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204153.16284: variable 'network_state' from source: role '' defaults 15980 1727204153.16294: Evaluated conditional (network_state != {}): False 15980 1727204153.16297: when evaluation is False, skipping this task 15980 1727204153.16300: _execute() done 15980 1727204153.16304: dumping result to json 15980 1727204153.16306: done dumping result, returning 15980 1727204153.16315: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-5f1d-4b72-00000000001d] 15980 1727204153.16321: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000001d 15980 1727204153.16433: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000001d 15980 1727204153.16437: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15980 1727204153.16494: no more pending results, returning what we have 15980 1727204153.16499: results queue empty 15980 1727204153.16499: checking 
for any_errors_fatal 15980 1727204153.16508: done checking for any_errors_fatal 15980 1727204153.16509: checking for max_fail_percentage 15980 1727204153.16511: done checking for max_fail_percentage 15980 1727204153.16512: checking to see if all hosts have failed and the running result is not ok 15980 1727204153.16513: done checking to see if all hosts have failed 15980 1727204153.16514: getting the remaining hosts for this loop 15980 1727204153.16516: done getting the remaining hosts for this loop 15980 1727204153.16520: getting the next task for host managed-node2 15980 1727204153.16528: done getting next task for host managed-node2 15980 1727204153.16532: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15980 1727204153.16534: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204153.16550: getting variables 15980 1727204153.16553: in VariableManager get_vars() 15980 1727204153.16597: Calling all_inventory to load vars for managed-node2 15980 1727204153.16600: Calling groups_inventory to load vars for managed-node2 15980 1727204153.16602: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204153.16615: Calling all_plugins_play to load vars for managed-node2 15980 1727204153.16618: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204153.16621: Calling groups_plugins_play to load vars for managed-node2 15980 1727204153.18717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204153.20934: done with get_vars() 15980 1727204153.20973: done getting variables 15980 1727204153.21039: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:55:53 -0400 (0:00:00.067) 0:00:14.621 ***** 15980 1727204153.21075: entering _queue_task() for managed-node2/package 15980 1727204153.21453: worker is 1 (out of 1 available) 15980 1727204153.21469: exiting _queue_task() for managed-node2/package 15980 1727204153.21483: done queuing things up, now waiting for results queue to drain 15980 1727204153.21485: waiting for pending results... 
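
Both nmstate-related install tasks above are gated on `network_state != {}`: the role only pulls in NetworkManager/nmstate packages when the play actually supplies a desired network state. A sketch of that gate, using the role default seen in the log (`network_state` from role defaults, i.e. an empty dict):

```python
# Role default, as reported by "variable 'network_state' from source: role '' defaults".
network_state = {}

# The task's `when:` condition: only install nmstate tooling if a state was requested.
run_nmstate_install = network_state != {}

# An empty dict means no network_state configuration was requested, so the
# install task is skipped, matching the skip records above.
```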
15980 1727204153.22190: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15980 1727204153.22197: in run() - task 127b8e07-fff9-5f1d-4b72-00000000001e 15980 1727204153.22322: variable 'ansible_search_path' from source: unknown 15980 1727204153.22329: variable 'ansible_search_path' from source: unknown 15980 1727204153.22373: calling self._execute() 15980 1727204153.22594: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204153.22599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204153.22614: variable 'omit' from source: magic vars 15980 1727204153.23581: variable 'ansible_distribution_major_version' from source: facts 15980 1727204153.23595: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204153.23856: variable 'network_state' from source: role '' defaults 15980 1727204153.23870: Evaluated conditional (network_state != {}): False 15980 1727204153.23949: when evaluation is False, skipping this task 15980 1727204153.23953: _execute() done 15980 1727204153.23956: dumping result to json 15980 1727204153.23959: done dumping result, returning 15980 1727204153.23968: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-5f1d-4b72-00000000001e] 15980 1727204153.23973: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000001e skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15980 1727204153.24142: no more pending results, returning what we have 15980 1727204153.24147: results queue empty 15980 1727204153.24148: checking for any_errors_fatal 15980 1727204153.24161: done checking for any_errors_fatal 15980 1727204153.24162: checking for max_fail_percentage 15980 
1727204153.24165: done checking for max_fail_percentage 15980 1727204153.24168: checking to see if all hosts have failed and the running result is not ok 15980 1727204153.24169: done checking to see if all hosts have failed 15980 1727204153.24170: getting the remaining hosts for this loop 15980 1727204153.24171: done getting the remaining hosts for this loop 15980 1727204153.24176: getting the next task for host managed-node2 15980 1727204153.24183: done getting next task for host managed-node2 15980 1727204153.24188: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15980 1727204153.24191: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204153.24209: getting variables 15980 1727204153.24211: in VariableManager get_vars() 15980 1727204153.24259: Calling all_inventory to load vars for managed-node2 15980 1727204153.24263: Calling groups_inventory to load vars for managed-node2 15980 1727204153.24568: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204153.24578: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000001e 15980 1727204153.24584: WORKER PROCESS EXITING 15980 1727204153.24600: Calling all_plugins_play to load vars for managed-node2 15980 1727204153.24604: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204153.24608: Calling groups_plugins_play to load vars for managed-node2 15980 1727204153.27168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204153.29321: done with get_vars() 15980 1727204153.29360: done getting variables 15980 1727204153.29483: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:55:53 -0400 (0:00:00.084) 0:00:14.705 ***** 15980 1727204153.29519: entering _queue_task() for managed-node2/service 15980 1727204153.29521: Creating lock for service 15980 1727204153.29911: worker is 1 (out of 1 available) 15980 1727204153.29925: exiting _queue_task() for managed-node2/service 15980 1727204153.29938: done queuing things up, now waiting for results queue to drain 15980 1727204153.29941: waiting for pending results... 15980 1727204153.30256: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15980 1727204153.30368: in run() - task 127b8e07-fff9-5f1d-4b72-00000000001f 15980 1727204153.30384: variable 'ansible_search_path' from source: unknown 15980 1727204153.30387: variable 'ansible_search_path' from source: unknown 15980 1727204153.30435: calling self._execute() 15980 1727204153.30544: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204153.30548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204153.30562: variable 'omit' from source: magic vars 15980 1727204153.30991: variable 'ansible_distribution_major_version' from source: facts 15980 1727204153.31005: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204153.31135: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204153.31357: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204153.33839: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204153.33882: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204153.33931: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204153.33968: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204153.34057: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204153.34153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204153.34157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204153.34161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204153.34272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204153.34276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204153.34278: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204153.34307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204153.34344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204153.34390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204153.34406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204153.34455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204153.34486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204153.34514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204153.34561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 
1727204153.34582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204153.34785: variable 'network_connections' from source: play vars 15980 1727204153.34870: variable 'interface' from source: set_fact 15980 1727204153.34892: variable 'interface' from source: set_fact 15980 1727204153.34905: variable 'interface' from source: set_fact 15980 1727204153.34976: variable 'interface' from source: set_fact 15980 1727204153.35062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204153.35279: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204153.35325: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204153.35418: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204153.35421: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204153.35441: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204153.35465: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204153.35495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204153.35536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204153.35615: variable '__network_team_connections_defined' from source: role '' defaults 15980 1727204153.35910: variable 'network_connections' from source: play vars 15980 1727204153.35922: variable 'interface' from source: set_fact 15980 1727204153.36001: variable 'interface' from source: set_fact 15980 1727204153.36015: variable 'interface' from source: set_fact 15980 1727204153.36087: variable 'interface' from source: set_fact 15980 1727204153.36183: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15980 1727204153.36187: when evaluation is False, skipping this task 15980 1727204153.36189: _execute() done 15980 1727204153.36192: dumping result to json 15980 1727204153.36194: done dumping result, returning 15980 1727204153.36196: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-5f1d-4b72-00000000001f] 15980 1727204153.36209: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000001f skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15980 1727204153.36520: no more pending results, returning what we have 15980 1727204153.36524: results queue empty 15980 1727204153.36525: checking for any_errors_fatal 15980 1727204153.36535: done checking for any_errors_fatal 15980 1727204153.36536: checking for max_fail_percentage 15980 1727204153.36538: done checking for max_fail_percentage 15980 1727204153.36539: checking to see if all hosts have failed and the running result is not ok 15980 1727204153.36540: done checking to see if all hosts have failed 15980 1727204153.36542: getting the remaining hosts for this loop 15980 1727204153.36544: 
done getting the remaining hosts for this loop 15980 1727204153.36549: getting the next task for host managed-node2 15980 1727204153.36555: done getting next task for host managed-node2 15980 1727204153.36560: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15980 1727204153.36562: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204153.36581: getting variables 15980 1727204153.36583: in VariableManager get_vars() 15980 1727204153.36627: Calling all_inventory to load vars for managed-node2 15980 1727204153.36630: Calling groups_inventory to load vars for managed-node2 15980 1727204153.36633: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204153.36645: Calling all_plugins_play to load vars for managed-node2 15980 1727204153.36649: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204153.36653: Calling groups_plugins_play to load vars for managed-node2 15980 1727204153.37234: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000001f 15980 1727204153.37238: WORKER PROCESS EXITING 15980 1727204153.38524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204153.40629: done with get_vars() 15980 1727204153.40669: done getting variables 15980 1727204153.40739: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:55:53 -0400 (0:00:00.112) 0:00:14.818 ***** 15980 1727204153.40775: entering _queue_task() for managed-node2/service 15980 1727204153.41145: worker is 1 (out of 1 available) 15980 1727204153.41160: exiting _queue_task() for managed-node2/service 15980 1727204153.41175: done queuing things up, now waiting for results queue to drain 15980 1727204153.41177: waiting for pending results... 15980 1727204153.41488: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15980 1727204153.41610: in run() - task 127b8e07-fff9-5f1d-4b72-000000000020 15980 1727204153.41633: variable 'ansible_search_path' from source: unknown 15980 1727204153.41641: variable 'ansible_search_path' from source: unknown 15980 1727204153.41689: calling self._execute() 15980 1727204153.41791: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204153.41804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204153.41825: variable 'omit' from source: magic vars 15980 1727204153.42258: variable 'ansible_distribution_major_version' from source: facts 15980 1727204153.42281: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204153.42471: variable 'network_provider' from source: set_fact 15980 1727204153.42501: variable 'network_state' from source: role '' defaults 15980 1727204153.42517: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15980 1727204153.42528: variable 'omit' from source: magic vars 15980 1727204153.42569: variable 'omit' from source: magic vars 15980 1727204153.42608: variable 'network_service_name' from source: role '' defaults 15980 1727204153.42771: variable 'network_service_name' from source: role '' defaults 15980 1727204153.42831: variable '__network_provider_setup' from 
source: role '' defaults 15980 1727204153.42845: variable '__network_service_name_default_nm' from source: role '' defaults 15980 1727204153.42924: variable '__network_service_name_default_nm' from source: role '' defaults 15980 1727204153.42938: variable '__network_packages_default_nm' from source: role '' defaults 15980 1727204153.43010: variable '__network_packages_default_nm' from source: role '' defaults 15980 1727204153.43276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204153.46047: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204153.46110: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204153.46139: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204153.46167: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204153.46188: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204153.46256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204153.46279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204153.46299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204153.46331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204153.46341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204153.46381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204153.46399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204153.46418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204153.46448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204153.46458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204153.46620: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15980 1727204153.46712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204153.46731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 15980 1727204153.46752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204153.46781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204153.46792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204153.46864: variable 'ansible_python' from source: facts 15980 1727204153.46885: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15980 1727204153.46950: variable '__network_wpa_supplicant_required' from source: role '' defaults 15980 1727204153.47010: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15980 1727204153.47103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204153.47121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204153.47141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204153.47171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 
1727204153.47183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204153.47221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204153.47243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204153.47262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204153.47293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204153.47304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204153.47405: variable 'network_connections' from source: play vars 15980 1727204153.47412: variable 'interface' from source: set_fact 15980 1727204153.47486: variable 'interface' from source: set_fact 15980 1727204153.47497: variable 'interface' from source: set_fact 15980 1727204153.47692: variable 'interface' from source: set_fact 15980 1727204153.47695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204153.47893: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204153.47962: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204153.47990: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204153.48070: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204153.48103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204153.48133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204153.48168: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204153.48200: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204153.48271: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204153.48549: variable 'network_connections' from source: play vars 15980 1727204153.48556: variable 'interface' from source: set_fact 15980 1727204153.48651: variable 'interface' from source: set_fact 15980 1727204153.48654: variable 'interface' from source: set_fact 15980 1727204153.48721: variable 'interface' from source: set_fact 15980 1727204153.48777: variable '__network_packages_default_wireless' from source: role '' defaults 15980 1727204153.48867: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204153.49078: variable 'network_connections' from source: play vars 15980 1727204153.49082: variable 'interface' from source: set_fact 15980 
1727204153.49135: variable 'interface' from source: set_fact 15980 1727204153.49140: variable 'interface' from source: set_fact 15980 1727204153.49195: variable 'interface' from source: set_fact 15980 1727204153.49219: variable '__network_packages_default_team' from source: role '' defaults 15980 1727204153.49279: variable '__network_team_connections_defined' from source: role '' defaults 15980 1727204153.49486: variable 'network_connections' from source: play vars 15980 1727204153.49490: variable 'interface' from source: set_fact 15980 1727204153.49545: variable 'interface' from source: set_fact 15980 1727204153.49550: variable 'interface' from source: set_fact 15980 1727204153.49605: variable 'interface' from source: set_fact 15980 1727204153.49657: variable '__network_service_name_default_initscripts' from source: role '' defaults 15980 1727204153.49704: variable '__network_service_name_default_initscripts' from source: role '' defaults 15980 1727204153.49710: variable '__network_packages_default_initscripts' from source: role '' defaults 15980 1727204153.49767: variable '__network_packages_default_initscripts' from source: role '' defaults 15980 1727204153.49916: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15980 1727204153.50264: variable 'network_connections' from source: play vars 15980 1727204153.50270: variable 'interface' from source: set_fact 15980 1727204153.50318: variable 'interface' from source: set_fact 15980 1727204153.50324: variable 'interface' from source: set_fact 15980 1727204153.50370: variable 'interface' from source: set_fact 15980 1727204153.50379: variable 'ansible_distribution' from source: facts 15980 1727204153.50382: variable '__network_rh_distros' from source: role '' defaults 15980 1727204153.50388: variable 'ansible_distribution_major_version' from source: facts 15980 1727204153.50409: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15980 
1727204153.50552: variable 'ansible_distribution' from source: facts 15980 1727204153.50555: variable '__network_rh_distros' from source: role '' defaults 15980 1727204153.50560: variable 'ansible_distribution_major_version' from source: facts 15980 1727204153.50567: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15980 1727204153.50788: variable 'ansible_distribution' from source: facts 15980 1727204153.50792: variable '__network_rh_distros' from source: role '' defaults 15980 1727204153.50794: variable 'ansible_distribution_major_version' from source: facts 15980 1727204153.50796: variable 'network_provider' from source: set_fact 15980 1727204153.50798: variable 'omit' from source: magic vars 15980 1727204153.50868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204153.50872: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204153.50874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204153.50884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204153.51003: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204153.51007: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204153.51010: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204153.51012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204153.51083: Set connection var ansible_connection to ssh 15980 1727204153.51086: Set connection var ansible_pipelining to False 15980 1727204153.51088: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204153.51090: Set connection var 
ansible_timeout to 10 15980 1727204153.51092: Set connection var ansible_shell_type to sh 15980 1727204153.51094: Set connection var ansible_shell_executable to /bin/sh 15980 1727204153.51256: variable 'ansible_shell_executable' from source: unknown 15980 1727204153.51260: variable 'ansible_connection' from source: unknown 15980 1727204153.51264: variable 'ansible_module_compression' from source: unknown 15980 1727204153.51268: variable 'ansible_shell_type' from source: unknown 15980 1727204153.51271: variable 'ansible_shell_executable' from source: unknown 15980 1727204153.51273: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204153.51280: variable 'ansible_pipelining' from source: unknown 15980 1727204153.51282: variable 'ansible_timeout' from source: unknown 15980 1727204153.51285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204153.51496: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204153.51500: variable 'omit' from source: magic vars 15980 1727204153.51502: starting attempt loop 15980 1727204153.51504: running the handler 15980 1727204153.51506: variable 'ansible_facts' from source: unknown 15980 1727204153.53047: _low_level_execute_command(): starting 15980 1727204153.53053: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204153.54370: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204153.54382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204153.54683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204153.54925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204153.56655: stdout chunk (state=3): >>>/root <<< 15980 1727204153.57101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204153.57105: stdout chunk (state=3): >>><<< 15980 1727204153.57108: stderr chunk (state=3): >>><<< 15980 1727204153.57111: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204153.57113: _low_level_execute_command(): starting 15980 1727204153.57117: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204153.5699418-17561-159260864021006 `" && echo ansible-tmp-1727204153.5699418-17561-159260864021006="` echo /root/.ansible/tmp/ansible-tmp-1727204153.5699418-17561-159260864021006 `" ) && sleep 0' 15980 1727204153.57810: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204153.57882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204153.57948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204153.60010: stdout chunk (state=3): >>>ansible-tmp-1727204153.5699418-17561-159260864021006=/root/.ansible/tmp/ansible-tmp-1727204153.5699418-17561-159260864021006 <<< 15980 1727204153.60475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204153.60480: stderr chunk (state=3): >>><<< 15980 1727204153.60483: stdout chunk (state=3): >>><<< 15980 1727204153.60486: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204153.5699418-17561-159260864021006=/root/.ansible/tmp/ansible-tmp-1727204153.5699418-17561-159260864021006 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 
1727204153.60489: variable 'ansible_module_compression' from source: unknown 15980 1727204153.60493: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 15980 1727204153.60496: ANSIBALLZ: Acquiring lock 15980 1727204153.60498: ANSIBALLZ: Lock acquired: 139981197612416 15980 1727204153.60500: ANSIBALLZ: Creating module 15980 1727204154.05087: ANSIBALLZ: Writing module into payload 15980 1727204154.05270: ANSIBALLZ: Writing module 15980 1727204154.05307: ANSIBALLZ: Renaming module 15980 1727204154.05313: ANSIBALLZ: Done creating module 15980 1727204154.05344: variable 'ansible_facts' from source: unknown 15980 1727204154.05517: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204153.5699418-17561-159260864021006/AnsiballZ_systemd.py 15980 1727204154.05686: Sending initial data 15980 1727204154.05689: Sent initial data (156 bytes) 15980 1727204154.06653: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204154.06673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204154.06677: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204154.06921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204154.07023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204154.08844: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204154.08900: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204154.09072: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpogrvw471 /root/.ansible/tmp/ansible-tmp-1727204153.5699418-17561-159260864021006/AnsiballZ_systemd.py <<< 15980 1727204154.09086: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204153.5699418-17561-159260864021006/AnsiballZ_systemd.py" <<< 15980 1727204154.09135: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpogrvw471" to remote "/root/.ansible/tmp/ansible-tmp-1727204153.5699418-17561-159260864021006/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204153.5699418-17561-159260864021006/AnsiballZ_systemd.py" <<< 15980 1727204154.12357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204154.12674: stderr chunk (state=3): >>><<< 15980 1727204154.12680: stdout chunk (state=3): >>><<< 15980 1727204154.12682: done transferring module to remote 15980 1727204154.12684: _low_level_execute_command(): starting 15980 1727204154.12687: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204153.5699418-17561-159260864021006/ /root/.ansible/tmp/ansible-tmp-1727204153.5699418-17561-159260864021006/AnsiballZ_systemd.py && sleep 0' 15980 1727204154.13187: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204154.13218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204154.13231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204154.13258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204154.13356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204154.15302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204154.15320: stdout chunk (state=3): >>><<< 15980 1727204154.15343: stderr chunk (state=3): >>><<< 15980 1727204154.15474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204154.15477: _low_level_execute_command(): starting 15980 1727204154.15480: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204153.5699418-17561-159260864021006/AnsiballZ_systemd.py && sleep 0' 15980 1727204154.16253: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204154.16285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204154.16299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204154.16351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204154.16459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204154.49008: stdout 
chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4460544", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3521306624", "CPUUsageNSec": "748339000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", 
"LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": 
"yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target multi-user.target network.service shutdown.target", "After": 
"systemd-journald.socket network-pre.target dbus-broker.service dbus.socket sysinit.target basic.target cloud-init-local.service system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:55 EDT", "StateChangeTimestampMonotonic": "382184288", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": 
"16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15980 1727204154.51178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204154.51183: stdout chunk (state=3): >>><<< 15980 1727204154.51186: stderr chunk (state=3): >>><<< 15980 1727204154.51189: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4460544", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3521306624", "CPUUsageNSec": "748339000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", 
"StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": 
"0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target multi-user.target network.service shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service dbus.socket sysinit.target basic.target cloud-init-local.service system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:55 EDT", "StateChangeTimestampMonotonic": "382184288", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", 
"CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 15980 1727204154.51552: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204153.5699418-17561-159260864021006/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204154.51597: _low_level_execute_command(): starting 15980 1727204154.51771: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204153.5699418-17561-159260864021006/ > /dev/null 2>&1 && sleep 0' 15980 1727204154.53054: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204154.53081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204154.53176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204154.53384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204154.53490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204154.55454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204154.55742: stderr chunk (state=3): >>><<< 15980 1727204154.55746: stdout chunk (state=3): >>><<< 15980 1727204154.55748: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204154.55751: handler run complete 15980 1727204154.55791: attempt loop complete, returning result 15980 1727204154.55794: _execute() done 15980 1727204154.55797: dumping result to json 15980 1727204154.55816: done dumping result, returning 15980 1727204154.55827: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-5f1d-4b72-000000000020] 15980 1727204154.55835: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000020 15980 1727204154.56463: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000020 15980 1727204154.56469: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15980 1727204154.56519: no more pending results, returning what we have 15980 1727204154.56522: results queue empty 15980 1727204154.56523: checking for any_errors_fatal 15980 1727204154.56530: done checking for any_errors_fatal 15980 1727204154.56531: checking for max_fail_percentage 15980 1727204154.56532: done checking for max_fail_percentage 15980 1727204154.56533: checking to see if all hosts have failed and the running result is not ok 15980 1727204154.56534: done checking to see if all hosts have failed 15980 1727204154.56535: getting the remaining hosts for this loop 15980 1727204154.56537: done getting the remaining hosts for this loop 15980 1727204154.56540: getting the next task for host managed-node2 15980 1727204154.56546: done getting next task for host managed-node2 15980 1727204154.56549: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable and start wpa_supplicant 15980 1727204154.56551: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204154.56561: getting variables 15980 1727204154.56562: in VariableManager get_vars() 15980 1727204154.56600: Calling all_inventory to load vars for managed-node2 15980 1727204154.56603: Calling groups_inventory to load vars for managed-node2 15980 1727204154.56606: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204154.56616: Calling all_plugins_play to load vars for managed-node2 15980 1727204154.56620: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204154.56623: Calling groups_plugins_play to load vars for managed-node2 15980 1727204154.60544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204154.65297: done with get_vars() 15980 1727204154.65338: done getting variables 15980 1727204154.65405: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:55:54 -0400 (0:00:01.246) 0:00:16.064 ***** 15980 1727204154.65441: entering _queue_task() for managed-node2/service 15980 1727204154.66214: worker is 1 (out of 1 available) 15980 1727204154.66231: exiting _queue_task() for managed-node2/service 
15980 1727204154.66244: done queuing things up, now waiting for results queue to drain 15980 1727204154.66246: waiting for pending results... 15980 1727204154.67192: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15980 1727204154.67198: in run() - task 127b8e07-fff9-5f1d-4b72-000000000021 15980 1727204154.67201: variable 'ansible_search_path' from source: unknown 15980 1727204154.67203: variable 'ansible_search_path' from source: unknown 15980 1727204154.67206: calling self._execute() 15980 1727204154.67285: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204154.67290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204154.67293: variable 'omit' from source: magic vars 15980 1727204154.68106: variable 'ansible_distribution_major_version' from source: facts 15980 1727204154.68376: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204154.68522: variable 'network_provider' from source: set_fact 15980 1727204154.68536: Evaluated conditional (network_provider == "nm"): True 15980 1727204154.68670: variable '__network_wpa_supplicant_required' from source: role '' defaults 15980 1727204154.68940: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15980 1727204154.69401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204154.74477: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204154.74483: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204154.74709: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204154.74713: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204154.74716: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204154.75309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204154.75347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204154.75573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204154.75577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204154.75580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204154.75633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204154.75820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204154.75852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 15980 1727204154.75906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204154.75988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204154.76047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204154.76146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204154.76181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204154.76273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204154.76354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204154.76631: variable 'network_connections' from source: play vars 15980 1727204154.76873: variable 'interface' from source: set_fact 15980 1727204154.77171: variable 'interface' from source: set_fact 15980 1727204154.77174: variable 'interface' from source: set_fact 15980 1727204154.77177: variable 'interface' from source: set_fact 15980 1727204154.77179: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204154.77651: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204154.77777: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204154.77816: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204154.77983: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204154.78040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204154.78189: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204154.78223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204154.78255: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204154.78316: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204154.78997: variable 'network_connections' from source: play vars 15980 1727204154.79045: variable 'interface' from source: set_fact 15980 1727204154.79124: variable 'interface' from source: set_fact 15980 1727204154.79263: variable 'interface' from source: set_fact 15980 1727204154.79337: variable 'interface' from source: set_fact 15980 1727204154.79583: Evaluated conditional 
(__network_wpa_supplicant_required): False 15980 1727204154.79587: when evaluation is False, skipping this task 15980 1727204154.79589: _execute() done 15980 1727204154.79603: dumping result to json 15980 1727204154.79605: done dumping result, returning 15980 1727204154.79609: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-5f1d-4b72-000000000021] 15980 1727204154.79612: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000021 15980 1727204154.79847: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000021 15980 1727204154.79851: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15980 1727204154.79903: no more pending results, returning what we have 15980 1727204154.79907: results queue empty 15980 1727204154.79908: checking for any_errors_fatal 15980 1727204154.79937: done checking for any_errors_fatal 15980 1727204154.79939: checking for max_fail_percentage 15980 1727204154.79940: done checking for max_fail_percentage 15980 1727204154.79941: checking to see if all hosts have failed and the running result is not ok 15980 1727204154.79942: done checking to see if all hosts have failed 15980 1727204154.79943: getting the remaining hosts for this loop 15980 1727204154.79945: done getting the remaining hosts for this loop 15980 1727204154.79950: getting the next task for host managed-node2 15980 1727204154.79957: done getting next task for host managed-node2 15980 1727204154.79961: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15980 1727204154.79964: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15980 1727204154.79980: getting variables 15980 1727204154.79982: in VariableManager get_vars() 15980 1727204154.80022: Calling all_inventory to load vars for managed-node2 15980 1727204154.80025: Calling groups_inventory to load vars for managed-node2 15980 1727204154.80029: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204154.80040: Calling all_plugins_play to load vars for managed-node2 15980 1727204154.80043: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204154.80045: Calling groups_plugins_play to load vars for managed-node2 15980 1727204154.83989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204154.88421: done with get_vars() 15980 1727204154.88463: done getting variables 15980 1727204154.88529: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:55:54 -0400 (0:00:00.231) 0:00:16.295 ***** 15980 1727204154.88559: entering _queue_task() for managed-node2/service 15980 1727204154.89325: worker is 1 (out of 1 available) 15980 1727204154.89344: exiting _queue_task() for managed-node2/service 15980 1727204154.89357: done queuing things up, now waiting for results queue to drain 15980 1727204154.89359: waiting for pending results... 
15980 1727204154.90016: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 15980 1727204154.90136: in run() - task 127b8e07-fff9-5f1d-4b72-000000000022 15980 1727204154.90273: variable 'ansible_search_path' from source: unknown 15980 1727204154.90277: variable 'ansible_search_path' from source: unknown 15980 1727204154.90296: calling self._execute() 15980 1727204154.90522: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204154.90557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204154.90671: variable 'omit' from source: magic vars 15980 1727204154.91567: variable 'ansible_distribution_major_version' from source: facts 15980 1727204154.91600: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204154.91919: variable 'network_provider' from source: set_fact 15980 1727204154.91957: Evaluated conditional (network_provider == "initscripts"): False 15980 1727204154.92074: when evaluation is False, skipping this task 15980 1727204154.92078: _execute() done 15980 1727204154.92081: dumping result to json 15980 1727204154.92084: done dumping result, returning 15980 1727204154.92087: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-5f1d-4b72-000000000022] 15980 1727204154.92089: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000022 15980 1727204154.92378: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000022 15980 1727204154.92382: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15980 1727204154.92433: no more pending results, returning what we have 15980 1727204154.92438: results queue empty 15980 1727204154.92439: checking for any_errors_fatal 15980 1727204154.92453: done checking for 
any_errors_fatal 15980 1727204154.92454: checking for max_fail_percentage 15980 1727204154.92455: done checking for max_fail_percentage 15980 1727204154.92456: checking to see if all hosts have failed and the running result is not ok 15980 1727204154.92457: done checking to see if all hosts have failed 15980 1727204154.92458: getting the remaining hosts for this loop 15980 1727204154.92460: done getting the remaining hosts for this loop 15980 1727204154.92464: getting the next task for host managed-node2 15980 1727204154.92673: done getting next task for host managed-node2 15980 1727204154.92679: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15980 1727204154.92681: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204154.92697: getting variables 15980 1727204154.92698: in VariableManager get_vars() 15980 1727204154.92737: Calling all_inventory to load vars for managed-node2 15980 1727204154.92740: Calling groups_inventory to load vars for managed-node2 15980 1727204154.92743: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204154.92754: Calling all_plugins_play to load vars for managed-node2 15980 1727204154.92757: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204154.92761: Calling groups_plugins_play to load vars for managed-node2 15980 1727204154.95489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204154.98436: done with get_vars() 15980 1727204154.98474: done getting variables 15980 1727204154.98545: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:55:54 -0400 (0:00:00.101) 0:00:16.397 ***** 15980 1727204154.98711: entering _queue_task() for managed-node2/copy 15980 1727204154.99123: worker is 1 (out of 1 available) 15980 1727204154.99141: exiting _queue_task() for managed-node2/copy 15980 1727204154.99155: done queuing things up, now waiting for results queue to drain 15980 1727204154.99158: waiting for pending results... 
15980 1727204154.99490: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15980 1727204154.99525: in run() - task 127b8e07-fff9-5f1d-4b72-000000000023 15980 1727204154.99547: variable 'ansible_search_path' from source: unknown 15980 1727204154.99620: variable 'ansible_search_path' from source: unknown 15980 1727204154.99624: calling self._execute() 15980 1727204154.99717: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204154.99739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204154.99754: variable 'omit' from source: magic vars 15980 1727204155.00189: variable 'ansible_distribution_major_version' from source: facts 15980 1727204155.00210: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204155.00347: variable 'network_provider' from source: set_fact 15980 1727204155.00359: Evaluated conditional (network_provider == "initscripts"): False 15980 1727204155.00371: when evaluation is False, skipping this task 15980 1727204155.00492: _execute() done 15980 1727204155.00496: dumping result to json 15980 1727204155.00499: done dumping result, returning 15980 1727204155.00502: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-5f1d-4b72-000000000023] 15980 1727204155.00505: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000023 15980 1727204155.00590: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000023 15980 1727204155.00594: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15980 1727204155.00649: no more pending results, returning what we have 15980 1727204155.00653: results queue empty 15980 1727204155.00654: checking for 
any_errors_fatal 15980 1727204155.00659: done checking for any_errors_fatal 15980 1727204155.00660: checking for max_fail_percentage 15980 1727204155.00662: done checking for max_fail_percentage 15980 1727204155.00663: checking to see if all hosts have failed and the running result is not ok 15980 1727204155.00664: done checking to see if all hosts have failed 15980 1727204155.00665: getting the remaining hosts for this loop 15980 1727204155.00668: done getting the remaining hosts for this loop 15980 1727204155.00672: getting the next task for host managed-node2 15980 1727204155.00678: done getting next task for host managed-node2 15980 1727204155.00683: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15980 1727204155.00685: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204155.00701: getting variables 15980 1727204155.00702: in VariableManager get_vars() 15980 1727204155.00745: Calling all_inventory to load vars for managed-node2 15980 1727204155.00748: Calling groups_inventory to load vars for managed-node2 15980 1727204155.00750: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204155.00763: Calling all_plugins_play to load vars for managed-node2 15980 1727204155.00893: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204155.00899: Calling groups_plugins_play to load vars for managed-node2 15980 1727204155.02787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204155.05045: done with get_vars() 15980 1727204155.05088: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:55:55 -0400 (0:00:00.064) 0:00:16.462 ***** 15980 1727204155.05191: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 15980 1727204155.05193: Creating lock for fedora.linux_system_roles.network_connections 15980 1727204155.05607: worker is 1 (out of 1 available) 15980 1727204155.05622: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 15980 1727204155.05635: done queuing things up, now waiting for results queue to drain 15980 1727204155.05637: waiting for pending results... 
15980 1727204155.06091: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15980 1727204155.06097: in run() - task 127b8e07-fff9-5f1d-4b72-000000000024 15980 1727204155.06189: variable 'ansible_search_path' from source: unknown 15980 1727204155.06193: variable 'ansible_search_path' from source: unknown 15980 1727204155.06196: calling self._execute() 15980 1727204155.06268: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204155.06281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204155.06305: variable 'omit' from source: magic vars 15980 1727204155.06746: variable 'ansible_distribution_major_version' from source: facts 15980 1727204155.06767: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204155.06780: variable 'omit' from source: magic vars 15980 1727204155.06830: variable 'omit' from source: magic vars 15980 1727204155.07036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204155.09619: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204155.09790: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204155.09794: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204155.09802: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204155.09834: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204155.09940: variable 'network_provider' from source: set_fact 15980 1727204155.10099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204155.10552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204155.10662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204155.10668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204155.10671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204155.10767: variable 'omit' from source: magic vars 15980 1727204155.10931: variable 'omit' from source: magic vars 15980 1727204155.11101: variable 'network_connections' from source: play vars 15980 1727204155.11106: variable 'interface' from source: set_fact 15980 1727204155.11211: variable 'interface' from source: set_fact 15980 1727204155.11216: variable 'interface' from source: set_fact 15980 1727204155.11251: variable 'interface' from source: set_fact 15980 1727204155.11450: variable 'omit' from source: magic vars 15980 1727204155.11671: variable '__lsr_ansible_managed' from source: task vars 15980 1727204155.11674: variable '__lsr_ansible_managed' from source: task vars 15980 1727204155.11742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15980 1727204155.11988: Loaded config def from plugin (lookup/template) 15980 1727204155.11997: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15980 1727204155.12040: File lookup term: get_ansible_managed.j2 15980 1727204155.12047: variable 'ansible_search_path' from source: unknown 15980 1727204155.12059: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15980 1727204155.12079: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15980 1727204155.12103: variable 'ansible_search_path' from source: unknown 15980 1727204155.19858: variable 'ansible_managed' from source: unknown 15980 1727204155.20064: variable 'omit' from source: magic vars 15980 1727204155.20111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204155.20149: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204155.20177: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204155.20222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204155.20230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204155.20332: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204155.20337: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204155.20339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204155.20392: Set connection var ansible_connection to ssh 15980 1727204155.20409: Set connection var ansible_pipelining to False 15980 1727204155.20440: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204155.20443: Set connection var ansible_timeout to 10 15980 1727204155.20445: Set connection var ansible_shell_type to sh 15980 1727204155.20459: Set connection var ansible_shell_executable to /bin/sh 15980 1727204155.20549: variable 'ansible_shell_executable' from source: unknown 15980 1727204155.20554: variable 'ansible_connection' from source: unknown 15980 1727204155.20561: variable 'ansible_module_compression' from source: unknown 15980 1727204155.20567: variable 'ansible_shell_type' from source: unknown 15980 1727204155.20570: variable 'ansible_shell_executable' from source: unknown 15980 1727204155.20572: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204155.20574: variable 'ansible_pipelining' from source: unknown 15980 1727204155.20577: variable 'ansible_timeout' from source: unknown 15980 1727204155.20579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204155.20707: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204155.20734: variable 'omit' from source: magic vars 15980 1727204155.20771: starting attempt loop 15980 1727204155.20778: running the handler 15980 1727204155.20875: _low_level_execute_command(): starting 15980 1727204155.20879: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204155.21610: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204155.21651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204155.21756: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204155.21793: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204155.21812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204155.21919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204155.23715: stdout chunk (state=3): >>>/root <<< 15980 1727204155.23872: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204155.24142: stderr chunk (state=3): >>><<< 15980 1727204155.24146: stdout chunk (state=3): >>><<< 15980 1727204155.24149: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204155.24152: _low_level_execute_command(): starting 15980 1727204155.24155: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204155.2407997-17692-42004185568501 `" && echo ansible-tmp-1727204155.2407997-17692-42004185568501="` echo /root/.ansible/tmp/ansible-tmp-1727204155.2407997-17692-42004185568501 `" ) && sleep 0' 15980 1727204155.25603: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204155.25609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204155.25612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204155.25782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204155.25907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204155.25985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204155.28067: stdout chunk (state=3): >>>ansible-tmp-1727204155.2407997-17692-42004185568501=/root/.ansible/tmp/ansible-tmp-1727204155.2407997-17692-42004185568501 <<< 15980 1727204155.28251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204155.28278: stderr chunk (state=3): >>><<< 15980 1727204155.28281: stdout chunk (state=3): >>><<< 15980 1727204155.28524: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204155.2407997-17692-42004185568501=/root/.ansible/tmp/ansible-tmp-1727204155.2407997-17692-42004185568501 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204155.28528: variable 'ansible_module_compression' from source: unknown 15980 1727204155.28672: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 15980 1727204155.28676: ANSIBALLZ: Acquiring lock 15980 1727204155.28678: ANSIBALLZ: Lock acquired: 139981196972096 15980 1727204155.28680: ANSIBALLZ: Creating module 15980 1727204155.63922: ANSIBALLZ: Writing module into payload 15980 1727204155.64281: ANSIBALLZ: Writing module 15980 1727204155.64313: ANSIBALLZ: Renaming module 15980 1727204155.64319: ANSIBALLZ: Done creating module 15980 1727204155.64354: variable 'ansible_facts' from source: unknown 15980 1727204155.64475: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204155.2407997-17692-42004185568501/AnsiballZ_network_connections.py 15980 1727204155.64688: Sending initial data 15980 1727204155.64691: Sent initial data (167 bytes) 15980 
1727204155.65381: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204155.65427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204155.65523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204155.65664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204155.65672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204155.65701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204155.65871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204155.67581: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: 
Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204155.67649: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15980 1727204155.67740: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmphtsfq3t0 /root/.ansible/tmp/ansible-tmp-1727204155.2407997-17692-42004185568501/AnsiballZ_network_connections.py <<< 15980 1727204155.67744: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204155.2407997-17692-42004185568501/AnsiballZ_network_connections.py" <<< 15980 1727204155.67819: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmphtsfq3t0" to remote "/root/.ansible/tmp/ansible-tmp-1727204155.2407997-17692-42004185568501/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204155.2407997-17692-42004185568501/AnsiballZ_network_connections.py" <<< 15980 1727204155.68998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204155.69219: stderr chunk (state=3): >>><<< 15980 1727204155.69222: stdout chunk (state=3): >>><<< 15980 1727204155.69227: done transferring module to remote 15980 1727204155.69230: _low_level_execute_command(): starting 15980 1727204155.69233: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204155.2407997-17692-42004185568501/ /root/.ansible/tmp/ansible-tmp-1727204155.2407997-17692-42004185568501/AnsiballZ_network_connections.py && sleep 0' 15980 1727204155.69807: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204155.69853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204155.69873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204155.69974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204155.72208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204155.72212: stdout chunk (state=3): >>><<< 15980 1727204155.72215: stderr chunk (state=3): >>><<< 15980 1727204155.72217: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204155.72220: _low_level_execute_command(): starting 15980 1727204155.72222: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204155.2407997-17692-42004185568501/AnsiballZ_network_connections.py && sleep 0' 15980 1727204155.74018: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204155.74053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204155.74213: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204155.74301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204155.74310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204155.74436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204156.08197: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 17a7d1a5-4da5-45e1-8ef4-6d7b416254ea\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 17a7d1a5-4da5-45e1-8ef4-6d7b416254ea (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15980 1727204156.11252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 15980 1727204156.11352: stderr chunk (state=3): >>><<< 15980 1727204156.11356: stdout chunk (state=3): >>><<< 15980 1727204156.11559: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 17a7d1a5-4da5-45e1-8ef4-6d7b416254ea\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 17a7d1a5-4da5-45e1-8ef4-6d7b416254ea (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 15980 1727204156.11563: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'interface_name': 'LSR-TST-br31', 'state': 'up', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': True}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204155.2407997-17692-42004185568501/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204156.11568: _low_level_execute_command(): starting 15980 1727204156.11571: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204155.2407997-17692-42004185568501/ > /dev/null 2>&1 && sleep 0' 15980 1727204156.12275: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204156.12290: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204156.12301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204156.12350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204156.12371: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204156.12468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204156.14821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204156.14825: stdout chunk (state=3): >>><<< 15980 1727204156.14828: stderr chunk (state=3): >>><<< 15980 1727204156.14830: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204156.14884: handler run complete 15980 1727204156.14890: attempt loop complete, returning result 15980 1727204156.14892: _execute() done 15980 1727204156.14895: dumping result to json 15980 1727204156.14897: done dumping result, returning 15980 1727204156.14929: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-5f1d-4b72-000000000024] 15980 1727204156.14932: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000024 15980 1727204156.15145: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000024 15980 1727204156.15150: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 17a7d1a5-4da5-45e1-8ef4-6d7b416254ea [004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 17a7d1a5-4da5-45e1-8ef4-6d7b416254ea (not-active) 15980 1727204156.15277: no more 
pending results, returning what we have 15980 1727204156.15281: results queue empty 15980 1727204156.15282: checking for any_errors_fatal 15980 1727204156.15294: done checking for any_errors_fatal 15980 1727204156.15295: checking for max_fail_percentage 15980 1727204156.15297: done checking for max_fail_percentage 15980 1727204156.15298: checking to see if all hosts have failed and the running result is not ok 15980 1727204156.15299: done checking to see if all hosts have failed 15980 1727204156.15300: getting the remaining hosts for this loop 15980 1727204156.15302: done getting the remaining hosts for this loop 15980 1727204156.15306: getting the next task for host managed-node2 15980 1727204156.15313: done getting next task for host managed-node2 15980 1727204156.15317: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15980 1727204156.15319: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204156.15331: getting variables 15980 1727204156.15332: in VariableManager get_vars() 15980 1727204156.15554: Calling all_inventory to load vars for managed-node2 15980 1727204156.15558: Calling groups_inventory to load vars for managed-node2 15980 1727204156.15560: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204156.15574: Calling all_plugins_play to load vars for managed-node2 15980 1727204156.15577: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204156.15579: Calling groups_plugins_play to load vars for managed-node2 15980 1727204156.19743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204156.20968: done with get_vars() 15980 1727204156.20994: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:56 -0400 (0:00:01.158) 0:00:17.620 ***** 15980 1727204156.21070: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 15980 1727204156.21072: Creating lock for fedora.linux_system_roles.network_state 15980 1727204156.21364: worker is 1 (out of 1 available) 15980 1727204156.21380: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 15980 1727204156.21392: done queuing things up, now waiting for results queue to drain 15980 1727204156.21394: waiting for pending results... 
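The `module_args` echoed in the preceding task result map back to the role's `network_connections` variable. A minimal sketch of a play that would produce that invocation is below; the play and host names are assumptions, while the connection settings are taken verbatim from the logged `module_args`:

```yaml
# Hedged reconstruction -- not the actual test playbook from this run.
- name: Configure the LSR-TST-br31 bridge (sketch)
  hosts: managed-node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: LSR-TST-br31
            interface_name: LSR-TST-br31
            type: bridge
            state: up
            ip:
              dhcp4: false   # matches "dhcp4": false in module_args
              auto6: true    # matches "auto6": true in module_args
```

The role translates this variable into the `fedora.linux_system_roles.network_connections` module call seen in the trace, adding the `__header` banner and provider (`nm`) itself.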
15980 1727204156.21807: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 15980 1727204156.21812: in run() - task 127b8e07-fff9-5f1d-4b72-000000000025 15980 1727204156.21832: variable 'ansible_search_path' from source: unknown 15980 1727204156.21836: variable 'ansible_search_path' from source: unknown 15980 1727204156.21873: calling self._execute() 15980 1727204156.22031: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204156.22094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204156.22123: variable 'omit' from source: magic vars 15980 1727204156.23139: variable 'ansible_distribution_major_version' from source: facts 15980 1727204156.23211: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204156.23470: variable 'network_state' from source: role '' defaults 15980 1727204156.23474: Evaluated conditional (network_state != {}): False 15980 1727204156.23477: when evaluation is False, skipping this task 15980 1727204156.23479: _execute() done 15980 1727204156.23482: dumping result to json 15980 1727204156.23484: done dumping result, returning 15980 1727204156.23487: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-5f1d-4b72-000000000025] 15980 1727204156.23490: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000025 15980 1727204156.23598: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000025 15980 1727204156.23601: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15980 1727204156.23664: no more pending results, returning what we have 15980 1727204156.23671: results queue empty 15980 1727204156.23672: checking for any_errors_fatal 15980 1727204156.23692: done checking for any_errors_fatal 
15980 1727204156.23693: checking for max_fail_percentage 15980 1727204156.23694: done checking for max_fail_percentage 15980 1727204156.23695: checking to see if all hosts have failed and the running result is not ok 15980 1727204156.23698: done checking to see if all hosts have failed 15980 1727204156.23699: getting the remaining hosts for this loop 15980 1727204156.23701: done getting the remaining hosts for this loop 15980 1727204156.23709: getting the next task for host managed-node2 15980 1727204156.23719: done getting next task for host managed-node2 15980 1727204156.23722: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15980 1727204156.23728: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204156.23752: getting variables 15980 1727204156.23754: in VariableManager get_vars() 15980 1727204156.24124: Calling all_inventory to load vars for managed-node2 15980 1727204156.24129: Calling groups_inventory to load vars for managed-node2 15980 1727204156.24132: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204156.24143: Calling all_plugins_play to load vars for managed-node2 15980 1727204156.24145: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204156.24148: Calling groups_plugins_play to load vars for managed-node2 15980 1727204156.26832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204156.30054: done with get_vars() 15980 1727204156.30088: done getting variables 15980 1727204156.30249: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:56 -0400 (0:00:00.092) 0:00:17.713 ***** 15980 1727204156.30351: entering _queue_task() for managed-node2/debug 15980 1727204156.31307: worker is 1 (out of 1 available) 15980 1727204156.31321: exiting _queue_task() for managed-node2/debug 15980 1727204156.31332: done queuing things up, now waiting for results queue to drain 15980 1727204156.31334: waiting for pending results... 
15980 1727204156.31714: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15980 1727204156.31721: in run() - task 127b8e07-fff9-5f1d-4b72-000000000026 15980 1727204156.31725: variable 'ansible_search_path' from source: unknown 15980 1727204156.31730: variable 'ansible_search_path' from source: unknown 15980 1727204156.31754: calling self._execute() 15980 1727204156.31861: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204156.31869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204156.31879: variable 'omit' from source: magic vars 15980 1727204156.32833: variable 'ansible_distribution_major_version' from source: facts 15980 1727204156.32855: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204156.32870: variable 'omit' from source: magic vars 15980 1727204156.33001: variable 'omit' from source: magic vars 15980 1727204156.33005: variable 'omit' from source: magic vars 15980 1727204156.33023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204156.33065: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204156.33115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204156.33138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204156.33154: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204156.33191: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204156.33200: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204156.33209: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 15980 1727204156.33435: Set connection var ansible_connection to ssh 15980 1727204156.33438: Set connection var ansible_pipelining to False 15980 1727204156.33441: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204156.33443: Set connection var ansible_timeout to 10 15980 1727204156.33445: Set connection var ansible_shell_type to sh 15980 1727204156.33447: Set connection var ansible_shell_executable to /bin/sh 15980 1727204156.33449: variable 'ansible_shell_executable' from source: unknown 15980 1727204156.33451: variable 'ansible_connection' from source: unknown 15980 1727204156.33453: variable 'ansible_module_compression' from source: unknown 15980 1727204156.33455: variable 'ansible_shell_type' from source: unknown 15980 1727204156.33457: variable 'ansible_shell_executable' from source: unknown 15980 1727204156.33459: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204156.33461: variable 'ansible_pipelining' from source: unknown 15980 1727204156.33463: variable 'ansible_timeout' from source: unknown 15980 1727204156.33468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204156.33614: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204156.33634: variable 'omit' from source: magic vars 15980 1727204156.33651: starting attempt loop 15980 1727204156.33654: running the handler 15980 1727204156.33869: variable '__network_connections_result' from source: set_fact 15980 1727204156.33879: handler run complete 15980 1727204156.33903: attempt loop complete, returning result 15980 1727204156.33910: _execute() done 15980 1727204156.33917: dumping result to json 15980 1727204156.33924: 
done dumping result, returning 15980 1727204156.33937: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-5f1d-4b72-000000000026] 15980 1727204156.33946: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000026 15980 1727204156.34278: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000026 15980 1727204156.34283: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 17a7d1a5-4da5-45e1-8ef4-6d7b416254ea", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 17a7d1a5-4da5-45e1-8ef4-6d7b416254ea (not-active)" ] } 15980 1727204156.34341: no more pending results, returning what we have 15980 1727204156.34344: results queue empty 15980 1727204156.34345: checking for any_errors_fatal 15980 1727204156.34350: done checking for any_errors_fatal 15980 1727204156.34351: checking for max_fail_percentage 15980 1727204156.34353: done checking for max_fail_percentage 15980 1727204156.34354: checking to see if all hosts have failed and the running result is not ok 15980 1727204156.34355: done checking to see if all hosts have failed 15980 1727204156.34355: getting the remaining hosts for this loop 15980 1727204156.34357: done getting the remaining hosts for this loop 15980 1727204156.34360: getting the next task for host managed-node2 15980 1727204156.34370: done getting next task for host managed-node2 15980 1727204156.34373: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15980 1727204156.34375: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15980 1727204156.34389: getting variables 15980 1727204156.34391: in VariableManager get_vars() 15980 1727204156.34482: Calling all_inventory to load vars for managed-node2 15980 1727204156.34485: Calling groups_inventory to load vars for managed-node2 15980 1727204156.34488: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204156.34502: Calling all_plugins_play to load vars for managed-node2 15980 1727204156.34518: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204156.34532: Calling groups_plugins_play to load vars for managed-node2 15980 1727204156.36192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204156.37371: done with get_vars() 15980 1727204156.37398: done getting variables 15980 1727204156.37453: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:56 -0400 (0:00:00.071) 0:00:17.785 ***** 15980 1727204156.37480: entering _queue_task() for managed-node2/debug 15980 1727204156.37902: worker is 1 (out of 1 available) 15980 1727204156.37915: exiting _queue_task() for managed-node2/debug 15980 1727204156.37931: done queuing things up, now waiting for results queue to drain 15980 1727204156.37933: waiting for pending results... 
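The two debug tasks traced here (task paths `.../roles/network/tasks/main.yml:177` and `:181`) print `__network_connections_result` fields via the `debug` action. Their bodies are an assumption inferred from the printed results, roughly:

```yaml
# Hedged sketch of the role's debug tasks; exact task bodies not shown in the log.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result
```

Both run entirely on the controller (the action plugin is loaded from `plugins/action/debug.py`), which is why no SSH traffic appears between their `running TaskExecutor()` and `handler run complete` lines.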
15980 1727204156.38284: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15980 1727204156.38776: in run() - task 127b8e07-fff9-5f1d-4b72-000000000027 15980 1727204156.38780: variable 'ansible_search_path' from source: unknown 15980 1727204156.38783: variable 'ansible_search_path' from source: unknown 15980 1727204156.38785: calling self._execute() 15980 1727204156.38963: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204156.38982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204156.38997: variable 'omit' from source: magic vars 15980 1727204156.39431: variable 'ansible_distribution_major_version' from source: facts 15980 1727204156.39456: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204156.39471: variable 'omit' from source: magic vars 15980 1727204156.39520: variable 'omit' from source: magic vars 15980 1727204156.39571: variable 'omit' from source: magic vars 15980 1727204156.39621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204156.39672: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204156.39700: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204156.39723: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204156.39740: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204156.39783: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204156.39793: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204156.39802: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 15980 1727204156.39906: Set connection var ansible_connection to ssh 15980 1727204156.39921: Set connection var ansible_pipelining to False 15980 1727204156.39933: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204156.39945: Set connection var ansible_timeout to 10 15980 1727204156.39955: Set connection var ansible_shell_type to sh 15980 1727204156.39967: Set connection var ansible_shell_executable to /bin/sh 15980 1727204156.40002: variable 'ansible_shell_executable' from source: unknown 15980 1727204156.40009: variable 'ansible_connection' from source: unknown 15980 1727204156.40016: variable 'ansible_module_compression' from source: unknown 15980 1727204156.40021: variable 'ansible_shell_type' from source: unknown 15980 1727204156.40026: variable 'ansible_shell_executable' from source: unknown 15980 1727204156.40032: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204156.40038: variable 'ansible_pipelining' from source: unknown 15980 1727204156.40043: variable 'ansible_timeout' from source: unknown 15980 1727204156.40049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204156.40205: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204156.40221: variable 'omit' from source: magic vars 15980 1727204156.40231: starting attempt loop 15980 1727204156.40236: running the handler 15980 1727204156.40524: variable '__network_connections_result' from source: set_fact 15980 1727204156.40549: variable '__network_connections_result' from source: set_fact 15980 1727204156.40825: handler run complete 15980 1727204156.40994: attempt loop complete, returning result 15980 1727204156.41003: 
_execute() done 15980 1727204156.41010: dumping result to json 15980 1727204156.41019: done dumping result, returning 15980 1727204156.41033: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-5f1d-4b72-000000000027] 15980 1727204156.41043: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000027 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 17a7d1a5-4da5-45e1-8ef4-6d7b416254ea\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 17a7d1a5-4da5-45e1-8ef4-6d7b416254ea (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 17a7d1a5-4da5-45e1-8ef4-6d7b416254ea", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 17a7d1a5-4da5-45e1-8ef4-6d7b416254ea (not-active)" ] } } 15980 1727204156.41479: no more pending results, returning what we have 15980 1727204156.41483: results queue empty 15980 1727204156.41484: checking for any_errors_fatal 15980 1727204156.41492: done checking for any_errors_fatal 15980 1727204156.41493: checking for max_fail_percentage 15980 1727204156.41495: done checking for max_fail_percentage 15980 1727204156.41496: checking to see if all hosts have failed and the running result is not ok 15980 1727204156.41497: done checking to see if all hosts have failed 15980 1727204156.41499: getting the remaining hosts for 
this loop 15980 1727204156.41500: done getting the remaining hosts for this loop 15980 1727204156.41505: getting the next task for host managed-node2 15980 1727204156.41512: done getting next task for host managed-node2 15980 1727204156.41516: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15980 1727204156.41518: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204156.41530: getting variables 15980 1727204156.41532: in VariableManager get_vars() 15980 1727204156.41782: Calling all_inventory to load vars for managed-node2 15980 1727204156.41786: Calling groups_inventory to load vars for managed-node2 15980 1727204156.41789: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204156.41801: Calling all_plugins_play to load vars for managed-node2 15980 1727204156.41804: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204156.41808: Calling groups_plugins_play to load vars for managed-node2 15980 1727204156.42677: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000027 15980 1727204156.42681: WORKER PROCESS EXITING 15980 1727204156.45806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204156.48814: done with get_vars() 15980 1727204156.48855: done getting variables 15980 1727204156.48931: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug 
messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:56 -0400 (0:00:00.114) 0:00:17.899 ***** 15980 1727204156.48968: entering _queue_task() for managed-node2/debug 15980 1727204156.49477: worker is 1 (out of 1 available) 15980 1727204156.49489: exiting _queue_task() for managed-node2/debug 15980 1727204156.49502: done queuing things up, now waiting for results queue to drain 15980 1727204156.49504: waiting for pending results... 15980 1727204156.49729: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15980 1727204156.49871: in run() - task 127b8e07-fff9-5f1d-4b72-000000000028 15980 1727204156.49900: variable 'ansible_search_path' from source: unknown 15980 1727204156.49909: variable 'ansible_search_path' from source: unknown 15980 1727204156.49959: calling self._execute() 15980 1727204156.50080: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204156.50095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204156.50113: variable 'omit' from source: magic vars 15980 1727204156.50647: variable 'ansible_distribution_major_version' from source: facts 15980 1727204156.50671: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204156.50986: variable 'network_state' from source: role '' defaults 15980 1727204156.51052: Evaluated conditional (network_state != {}): False 15980 1727204156.51063: when evaluation is False, skipping this task 15980 1727204156.51076: _execute() done 15980 1727204156.51098: dumping result to json 15980 1727204156.51106: done dumping result, returning 15980 1727204156.51128: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-5f1d-4b72-000000000028] 15980 
1727204156.51140: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000028 skipping: [managed-node2] => { "false_condition": "network_state != {}" } 15980 1727204156.51476: no more pending results, returning what we have 15980 1727204156.51480: results queue empty 15980 1727204156.51481: checking for any_errors_fatal 15980 1727204156.51495: done checking for any_errors_fatal 15980 1727204156.51496: checking for max_fail_percentage 15980 1727204156.51499: done checking for max_fail_percentage 15980 1727204156.51500: checking to see if all hosts have failed and the running result is not ok 15980 1727204156.51502: done checking to see if all hosts have failed 15980 1727204156.51502: getting the remaining hosts for this loop 15980 1727204156.51504: done getting the remaining hosts for this loop 15980 1727204156.51509: getting the next task for host managed-node2 15980 1727204156.51516: done getting next task for host managed-node2 15980 1727204156.51523: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15980 1727204156.51527: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204156.51544: getting variables 15980 1727204156.51546: in VariableManager get_vars() 15980 1727204156.51594: Calling all_inventory to load vars for managed-node2 15980 1727204156.51597: Calling groups_inventory to load vars for managed-node2 15980 1727204156.51600: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204156.51616: Calling all_plugins_play to load vars for managed-node2 15980 1727204156.51619: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204156.51622: Calling groups_plugins_play to load vars for managed-node2 15980 1727204156.52186: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000028 15980 1727204156.52191: WORKER PROCESS EXITING 15980 1727204156.53722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204156.56927: done with get_vars() 15980 1727204156.56972: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:56 -0400 (0:00:00.082) 0:00:17.982 ***** 15980 1727204156.57240: entering _queue_task() for managed-node2/ping 15980 1727204156.57242: Creating lock for ping 15980 1727204156.58049: worker is 1 (out of 1 available) 15980 1727204156.58063: exiting _queue_task() for managed-node2/ping 15980 1727204156.58080: done queuing things up, now waiting for results queue to drain 15980 1727204156.58082: waiting for pending results... 
15980 1727204156.58685: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 15980 1727204156.59252: in run() - task 127b8e07-fff9-5f1d-4b72-000000000029 15980 1727204156.59258: variable 'ansible_search_path' from source: unknown 15980 1727204156.59261: variable 'ansible_search_path' from source: unknown 15980 1727204156.59263: calling self._execute() 15980 1727204156.59333: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204156.59369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204156.59575: variable 'omit' from source: magic vars 15980 1727204156.60307: variable 'ansible_distribution_major_version' from source: facts 15980 1727204156.60388: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204156.60401: variable 'omit' from source: magic vars 15980 1727204156.60589: variable 'omit' from source: magic vars 15980 1727204156.60635: variable 'omit' from source: magic vars 15980 1727204156.60708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204156.60817: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204156.60915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204156.60942: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204156.61012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204156.61050: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204156.61119: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204156.61130: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 15980 1727204156.61370: Set connection var ansible_connection to ssh 15980 1727204156.61411: Set connection var ansible_pipelining to False 15980 1727204156.61450: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204156.61460: Set connection var ansible_timeout to 10 15980 1727204156.61531: Set connection var ansible_shell_type to sh 15980 1727204156.61534: Set connection var ansible_shell_executable to /bin/sh 15980 1727204156.61640: variable 'ansible_shell_executable' from source: unknown 15980 1727204156.61643: variable 'ansible_connection' from source: unknown 15980 1727204156.61646: variable 'ansible_module_compression' from source: unknown 15980 1727204156.61648: variable 'ansible_shell_type' from source: unknown 15980 1727204156.61652: variable 'ansible_shell_executable' from source: unknown 15980 1727204156.61654: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204156.61656: variable 'ansible_pipelining' from source: unknown 15980 1727204156.61658: variable 'ansible_timeout' from source: unknown 15980 1727204156.61660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204156.62319: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204156.62324: variable 'omit' from source: magic vars 15980 1727204156.62327: starting attempt loop 15980 1727204156.62329: running the handler 15980 1727204156.62332: _low_level_execute_command(): starting 15980 1727204156.62334: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204156.63694: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 
1727204156.63704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 15980 1727204156.63909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204156.63913: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204156.63916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204156.64054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204156.64139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204156.64282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204156.66213: stdout chunk (state=3): >>>/root <<< 15980 1727204156.66287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204156.66373: stderr chunk (state=3): >>><<< 15980 1727204156.66424: stdout chunk (state=3): >>><<< 15980 1727204156.66569: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204156.66573: _low_level_execute_command(): starting 15980 1727204156.66577: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204156.6645973-17738-151117158857958 `" && echo ansible-tmp-1727204156.6645973-17738-151117158857958="` echo /root/.ansible/tmp/ansible-tmp-1727204156.6645973-17738-151117158857958 `" ) && sleep 0' 15980 1727204156.68080: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204156.68085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204156.68531: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204156.68708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204156.70710: stdout chunk (state=3): >>>ansible-tmp-1727204156.6645973-17738-151117158857958=/root/.ansible/tmp/ansible-tmp-1727204156.6645973-17738-151117158857958 <<< 15980 1727204156.71012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204156.71101: stderr chunk (state=3): >>><<< 15980 1727204156.71105: stdout chunk (state=3): >>><<< 15980 1727204156.71108: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204156.6645973-17738-151117158857958=/root/.ansible/tmp/ansible-tmp-1727204156.6645973-17738-151117158857958 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204156.71378: variable 'ansible_module_compression' from source: unknown 15980 1727204156.71382: ANSIBALLZ: Using lock for ping 15980 1727204156.71384: ANSIBALLZ: Acquiring lock 15980 1727204156.71386: ANSIBALLZ: Lock acquired: 139981192771968 15980 1727204156.71388: ANSIBALLZ: Creating module 15980 1727204157.02130: ANSIBALLZ: Writing module into payload 15980 1727204157.02236: ANSIBALLZ: Writing module 15980 1727204157.02337: ANSIBALLZ: Renaming module 15980 1727204157.02351: ANSIBALLZ: Done creating module 15980 1727204157.02406: variable 'ansible_facts' from source: unknown 15980 1727204157.02533: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204156.6645973-17738-151117158857958/AnsiballZ_ping.py 15980 1727204157.02687: Sending initial data 15980 1727204157.02715: Sent initial data (153 bytes) 15980 1727204157.04626: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204157.04694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204157.04869: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204157.05013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204157.05022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204157.05186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204157.05247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204157.07126: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204157.07157: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204157.07235: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp825r0ut2 /root/.ansible/tmp/ansible-tmp-1727204156.6645973-17738-151117158857958/AnsiballZ_ping.py <<< 15980 1727204157.07239: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204156.6645973-17738-151117158857958/AnsiballZ_ping.py" <<< 15980 1727204157.07331: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp825r0ut2" to remote "/root/.ansible/tmp/ansible-tmp-1727204156.6645973-17738-151117158857958/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204156.6645973-17738-151117158857958/AnsiballZ_ping.py" <<< 15980 1727204157.08712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204157.08879: stderr chunk (state=3): >>><<< 15980 1727204157.08883: stdout chunk (state=3): >>><<< 15980 1727204157.08885: done transferring module to remote 15980 1727204157.08888: _low_level_execute_command(): starting 15980 1727204157.08890: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204156.6645973-17738-151117158857958/ /root/.ansible/tmp/ansible-tmp-1727204156.6645973-17738-151117158857958/AnsiballZ_ping.py && sleep 0' 15980 1727204157.09525: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204157.09648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204157.09655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204157.09658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204157.09704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204157.09789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204157.11672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204157.11767: stderr chunk (state=3): >>><<< 15980 1727204157.11771: stdout chunk (state=3): >>><<< 15980 1727204157.11808: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204157.11812: _low_level_execute_command(): starting 15980 1727204157.11814: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204156.6645973-17738-151117158857958/AnsiballZ_ping.py && sleep 0' 15980 1727204157.12602: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204157.12606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204157.12706: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15980 1727204157.12713: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 15980 1727204157.12717: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204157.12720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204157.12722: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 
1727204157.12780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204157.12869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204157.29118: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15980 1727204157.30433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204157.30517: stderr chunk (state=3): >>><<< 15980 1727204157.30521: stdout chunk (state=3): >>><<< 15980 1727204157.30524: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
15980 1727204157.30549: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204156.6645973-17738-151117158857958/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204157.30559: _low_level_execute_command(): starting 15980 1727204157.30565: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204156.6645973-17738-151117158857958/ > /dev/null 2>&1 && sleep 0' 15980 1727204157.31078: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204157.31082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204157.31087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204157.31089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204157.31189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204157.31276: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204157.42642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204157.42704: stderr chunk (state=3): >>><<< 15980 1727204157.42708: stdout chunk (state=3): >>><<< 15980 1727204157.42721: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204157.42731: handler run complete 15980 1727204157.42745: attempt loop complete, returning result 
15980 1727204157.42748: _execute() done 15980 1727204157.42778: dumping result to json 15980 1727204157.42781: done dumping result, returning 15980 1727204157.42788: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-5f1d-4b72-000000000029] 15980 1727204157.42793: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000029 15980 1727204157.42923: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000029 15980 1727204157.42926: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 15980 1727204157.43006: no more pending results, returning what we have 15980 1727204157.43010: results queue empty 15980 1727204157.43010: checking for any_errors_fatal 15980 1727204157.43018: done checking for any_errors_fatal 15980 1727204157.43019: checking for max_fail_percentage 15980 1727204157.43021: done checking for max_fail_percentage 15980 1727204157.43022: checking to see if all hosts have failed and the running result is not ok 15980 1727204157.43023: done checking to see if all hosts have failed 15980 1727204157.43024: getting the remaining hosts for this loop 15980 1727204157.43026: done getting the remaining hosts for this loop 15980 1727204157.43031: getting the next task for host managed-node2 15980 1727204157.43039: done getting next task for host managed-node2 15980 1727204157.43041: ^ task is: TASK: meta (role_complete) 15980 1727204157.43043: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204157.43055: getting variables 15980 1727204157.43057: in VariableManager get_vars() 15980 1727204157.43098: Calling all_inventory to load vars for managed-node2 15980 1727204157.43101: Calling groups_inventory to load vars for managed-node2 15980 1727204157.43104: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204157.43115: Calling all_plugins_play to load vars for managed-node2 15980 1727204157.43118: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204157.43121: Calling groups_plugins_play to load vars for managed-node2 15980 1727204157.44602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204157.46762: done with get_vars() 15980 1727204157.46809: done getting variables 15980 1727204157.47016: done queuing things up, now waiting for results queue to drain 15980 1727204157.47019: results queue empty 15980 1727204157.47019: checking for any_errors_fatal 15980 1727204157.47023: done checking for any_errors_fatal 15980 1727204157.47024: checking for max_fail_percentage 15980 1727204157.47025: done checking for max_fail_percentage 15980 1727204157.47026: checking to see if all hosts have failed and the running result is not ok 15980 1727204157.47027: done checking to see if all hosts have failed 15980 1727204157.47027: getting the remaining hosts for this loop 15980 1727204157.47028: done getting the remaining hosts for this loop 15980 1727204157.47031: getting the next task for host managed-node2 15980 1727204157.47035: done getting next task for host managed-node2 15980 1727204157.47037: ^ task is: TASK: meta (flush_handlers) 15980 1727204157.47038: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15980 1727204157.47041: getting variables 15980 1727204157.47042: in VariableManager get_vars() 15980 1727204157.47058: Calling all_inventory to load vars for managed-node2 15980 1727204157.47061: Calling groups_inventory to load vars for managed-node2 15980 1727204157.47063: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204157.47173: Calling all_plugins_play to load vars for managed-node2 15980 1727204157.47177: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204157.47181: Calling groups_plugins_play to load vars for managed-node2 15980 1727204157.49269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204157.51803: done with get_vars() 15980 1727204157.51843: done getting variables 15980 1727204157.52318: in VariableManager get_vars() 15980 1727204157.52337: Calling all_inventory to load vars for managed-node2 15980 1727204157.52342: Calling groups_inventory to load vars for managed-node2 15980 1727204157.52345: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204157.52351: Calling all_plugins_play to load vars for managed-node2 15980 1727204157.52354: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204157.52356: Calling groups_plugins_play to load vars for managed-node2 15980 1727204157.54974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204157.58025: done with get_vars() 15980 1727204157.58072: done queuing things up, now waiting for results queue to drain 15980 1727204157.58075: results queue empty 15980 1727204157.58076: checking for any_errors_fatal 15980 1727204157.58077: done checking for any_errors_fatal 15980 1727204157.58078: checking for max_fail_percentage 15980 1727204157.58079: done checking for max_fail_percentage 15980 1727204157.58080: checking to see if all hosts have failed and 
the running result is not ok 15980 1727204157.58081: done checking to see if all hosts have failed 15980 1727204157.58082: getting the remaining hosts for this loop 15980 1727204157.58083: done getting the remaining hosts for this loop 15980 1727204157.58086: getting the next task for host managed-node2 15980 1727204157.58090: done getting next task for host managed-node2 15980 1727204157.58092: ^ task is: TASK: meta (flush_handlers) 15980 1727204157.58094: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204157.58097: getting variables 15980 1727204157.58098: in VariableManager get_vars() 15980 1727204157.58113: Calling all_inventory to load vars for managed-node2 15980 1727204157.58115: Calling groups_inventory to load vars for managed-node2 15980 1727204157.58117: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204157.58124: Calling all_plugins_play to load vars for managed-node2 15980 1727204157.58126: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204157.58129: Calling groups_plugins_play to load vars for managed-node2 15980 1727204157.66117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204157.68388: done with get_vars() 15980 1727204157.68427: done getting variables 15980 1727204157.68497: in VariableManager get_vars() 15980 1727204157.68512: Calling all_inventory to load vars for managed-node2 15980 1727204157.68514: Calling groups_inventory to load vars for managed-node2 15980 1727204157.68517: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204157.68522: Calling all_plugins_play to load vars for managed-node2 15980 1727204157.68524: Calling 
groups_plugins_inventory to load vars for managed-node2 15980 1727204157.68530: Calling groups_plugins_play to load vars for managed-node2 15980 1727204157.70106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204157.72609: done with get_vars() 15980 1727204157.72654: done queuing things up, now waiting for results queue to drain 15980 1727204157.72656: results queue empty 15980 1727204157.72657: checking for any_errors_fatal 15980 1727204157.72659: done checking for any_errors_fatal 15980 1727204157.72659: checking for max_fail_percentage 15980 1727204157.72661: done checking for max_fail_percentage 15980 1727204157.72661: checking to see if all hosts have failed and the running result is not ok 15980 1727204157.72662: done checking to see if all hosts have failed 15980 1727204157.72663: getting the remaining hosts for this loop 15980 1727204157.72664: done getting the remaining hosts for this loop 15980 1727204157.72669: getting the next task for host managed-node2 15980 1727204157.72673: done getting next task for host managed-node2 15980 1727204157.72674: ^ task is: None 15980 1727204157.72675: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204157.72678: done queuing things up, now waiting for results queue to drain 15980 1727204157.72679: results queue empty 15980 1727204157.72680: checking for any_errors_fatal 15980 1727204157.72680: done checking for any_errors_fatal 15980 1727204157.72681: checking for max_fail_percentage 15980 1727204157.72682: done checking for max_fail_percentage 15980 1727204157.72683: checking to see if all hosts have failed and the running result is not ok 15980 1727204157.72684: done checking to see if all hosts have failed 15980 1727204157.72685: getting the next task for host managed-node2 15980 1727204157.72687: done getting next task for host managed-node2 15980 1727204157.72688: ^ task is: None 15980 1727204157.72689: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204157.72736: in VariableManager get_vars() 15980 1727204157.72756: done with get_vars() 15980 1727204157.72762: in VariableManager get_vars() 15980 1727204157.72776: done with get_vars() 15980 1727204157.72780: variable 'omit' from source: magic vars 15980 1727204157.72898: variable 'task' from source: play vars 15980 1727204157.72940: in VariableManager get_vars() 15980 1727204157.72953: done with get_vars() 15980 1727204157.72974: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_present.yml] ************************ 15980 1727204157.73244: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15980 1727204157.73276: getting the remaining hosts for this loop 15980 1727204157.73278: done getting the remaining hosts for this loop 15980 1727204157.73280: getting the next task for host managed-node2 15980 1727204157.73283: done getting next task for host managed-node2 15980 1727204157.73286: ^ task is: TASK: Gathering Facts 15980 1727204157.73287: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204157.73290: getting variables 15980 1727204157.73291: in VariableManager get_vars() 15980 1727204157.73300: Calling all_inventory to load vars for managed-node2 15980 1727204157.73302: Calling groups_inventory to load vars for managed-node2 15980 1727204157.73305: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204157.73311: Calling all_plugins_play to load vars for managed-node2 15980 1727204157.73314: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204157.73317: Calling groups_plugins_play to load vars for managed-node2 15980 1727204157.75653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204157.78716: done with get_vars() 15980 1727204157.78754: done getting variables 15980 1727204157.78814: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Tuesday 24 September 2024 14:55:57 -0400 (0:00:01.216) 0:00:19.198 ***** 15980 1727204157.78844: entering _queue_task() for managed-node2/gather_facts 15980 1727204157.79247: worker is 1 (out of 1 available) 15980 1727204157.79261: exiting _queue_task() for managed-node2/gather_facts 15980 1727204157.79275: done queuing things up, now waiting for results queue to drain 15980 1727204157.79277: waiting for pending results... 
15980 1727204157.79585: running TaskExecutor() for managed-node2/TASK: Gathering Facts 15980 1727204157.79745: in run() - task 127b8e07-fff9-5f1d-4b72-000000000219 15980 1727204157.79750: variable 'ansible_search_path' from source: unknown 15980 1727204157.79851: calling self._execute() 15980 1727204157.79910: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204157.79924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204157.79940: variable 'omit' from source: magic vars 15980 1727204157.80658: variable 'ansible_distribution_major_version' from source: facts 15980 1727204157.80686: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204157.80699: variable 'omit' from source: magic vars 15980 1727204157.80839: variable 'omit' from source: magic vars 15980 1727204157.80844: variable 'omit' from source: magic vars 15980 1727204157.80946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204157.80950: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204157.80959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204157.80977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204157.80996: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204157.81035: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204157.81045: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204157.81058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204157.81185: Set connection var ansible_connection to ssh 15980 1727204157.81200: Set 
connection var ansible_pipelining to False 15980 1727204157.81211: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204157.81221: Set connection var ansible_timeout to 10 15980 1727204157.81231: Set connection var ansible_shell_type to sh 15980 1727204157.81240: Set connection var ansible_shell_executable to /bin/sh 15980 1727204157.81288: variable 'ansible_shell_executable' from source: unknown 15980 1727204157.81297: variable 'ansible_connection' from source: unknown 15980 1727204157.81304: variable 'ansible_module_compression' from source: unknown 15980 1727204157.81383: variable 'ansible_shell_type' from source: unknown 15980 1727204157.81386: variable 'ansible_shell_executable' from source: unknown 15980 1727204157.81395: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204157.81398: variable 'ansible_pipelining' from source: unknown 15980 1727204157.81400: variable 'ansible_timeout' from source: unknown 15980 1727204157.81402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204157.81560: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204157.81581: variable 'omit' from source: magic vars 15980 1727204157.81591: starting attempt loop 15980 1727204157.81602: running the handler 15980 1727204157.81631: variable 'ansible_facts' from source: unknown 15980 1727204157.81654: _low_level_execute_command(): starting 15980 1727204157.81668: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204157.82602: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204157.82643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204157.82663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204157.82712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204157.82797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204157.84599: stdout chunk (state=3): >>>/root <<< 15980 1727204157.84695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204157.84788: stderr chunk (state=3): >>><<< 15980 1727204157.84792: stdout chunk (state=3): >>><<< 15980 1727204157.84816: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204157.84927: _low_level_execute_command(): starting 15980 1727204157.84931: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204157.8482416-17779-252551440727649 `" && echo ansible-tmp-1727204157.8482416-17779-252551440727649="` echo /root/.ansible/tmp/ansible-tmp-1727204157.8482416-17779-252551440727649 `" ) && sleep 0' 15980 1727204157.86438: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204157.86864: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204157.87333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204157.87404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204157.87596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204157.90196: stdout chunk (state=3): >>>ansible-tmp-1727204157.8482416-17779-252551440727649=/root/.ansible/tmp/ansible-tmp-1727204157.8482416-17779-252551440727649 <<< 15980 1727204157.90201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204157.90241: stderr chunk (state=3): >>><<< 15980 1727204157.90333: stdout chunk (state=3): >>><<< 15980 1727204157.90394: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204157.8482416-17779-252551440727649=/root/.ansible/tmp/ansible-tmp-1727204157.8482416-17779-252551440727649 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204157.90437: variable 'ansible_module_compression' from source: unknown 15980 1727204157.90729: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15980 1727204157.91004: variable 'ansible_facts' from source: unknown 15980 1727204157.91605: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204157.8482416-17779-252551440727649/AnsiballZ_setup.py 15980 1727204157.91970: Sending initial data 15980 1727204157.91974: Sent initial data (154 bytes) 15980 1727204157.92652: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204157.92683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204157.92718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15980 1727204157.92812: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204157.92833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204157.92946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204157.94787: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204157.94897: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204157.94961: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpgi2uflz5 /root/.ansible/tmp/ansible-tmp-1727204157.8482416-17779-252551440727649/AnsiballZ_setup.py <<< 15980 1727204157.94964: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204157.8482416-17779-252551440727649/AnsiballZ_setup.py" <<< 15980 1727204157.95059: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpgi2uflz5" to remote "/root/.ansible/tmp/ansible-tmp-1727204157.8482416-17779-252551440727649/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204157.8482416-17779-252551440727649/AnsiballZ_setup.py" <<< 15980 1727204157.97945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204157.97950: stdout chunk (state=3): >>><<< 15980 1727204157.97952: stderr chunk (state=3): >>><<< 15980 1727204157.97993: done transferring module to remote 15980 1727204157.98047: _low_level_execute_command(): starting 15980 1727204157.98051: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204157.8482416-17779-252551440727649/ /root/.ansible/tmp/ansible-tmp-1727204157.8482416-17779-252551440727649/AnsiballZ_setup.py && sleep 0' 15980 1727204157.99294: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204157.99315: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204157.99339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204157.99362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204157.99383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 
1727204157.99395: stderr chunk (state=3): >>>debug2: match not found <<< 15980 1727204157.99408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204157.99424: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15980 1727204157.99451: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204157.99482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204157.99572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204157.99620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204157.99771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204158.01909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204158.01914: stdout chunk (state=3): >>><<< 15980 1727204158.01917: stderr chunk (state=3): >>><<< 15980 1727204158.02080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204158.02090: _low_level_execute_command(): starting 15980 1727204158.02176: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204157.8482416-17779-252551440727649/AnsiballZ_setup.py && sleep 0' 15980 1727204158.03285: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204158.03299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204158.03319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204158.03391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204158.03460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204158.03503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204158.03633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204158.69275: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "58", "epoch": "1727204158", "epoch_int": "1727204158", "date": "2024-09-24", "time": "14:55:58", "iso8601_micro": "2024-09-24T18:55:58.337294Z", "iso8601": "2024-09-24T18:55:58Z", "iso8601_basic": "20240924T145558337294", "iso8601_basic_short": "20240924T145558", "tz": "EDT", "tz_dst": "EDT", 
"tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", 
"SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.7578125, "5m": 0.51953125, "15m": 0.25439453125}, "ansible_processor": ["0", "GenuineIntel"<<< 15980 1727204158.69303: stdout chunk (state=3): >>>, "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3056, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 660, "free": 3056}, "nocache": {"free": 3485, "used": 231}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], 
"uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 504, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325722624, "block_size": 4096, "block_total": 64479564, "block_available": 61358819, "block_used": 3120745, "inode_total": 16384000, "inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_fips": false, 
"ansible_interfaces": ["LSR-TST-br31", "lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "42:c8:ed:20:3e:1e", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", 
"ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15980 1727204158.71390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204158.71394: stdout chunk (state=3): >>><<< 15980 1727204158.71396: stderr chunk (state=3): >>><<< 15980 1727204158.71430: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", 
"weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "58", "epoch": "1727204158", "epoch_int": "1727204158", "date": "2024-09-24", "time": "14:55:58", "iso8601_micro": "2024-09-24T18:55:58.337294Z", "iso8601": "2024-09-24T18:55:58Z", "iso8601_basic": "20240924T145558337294", "iso8601_basic_short": "20240924T145558", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": 
"user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.7578125, "5m": 0.51953125, "15m": 0.25439453125}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3056, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 660, "free": 3056}, "nocache": {"free": 3485, "used": 231}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", 
"ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 504, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325722624, "block_size": 4096, "block_total": 64479564, "block_available": 61358819, "block_used": 3120745, "inode_total": 16384000, 
"inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_fips": false, "ansible_interfaces": ["LSR-TST-br31", "lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "42:c8:ed:20:3e:1e", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": 
"/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 15980 1727204158.72060: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204157.8482416-17779-252551440727649/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204158.72064: _low_level_execute_command(): starting 15980 1727204158.72069: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204157.8482416-17779-252551440727649/ > /dev/null 2>&1 && sleep 0' 15980 1727204158.72740: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204158.72756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204158.72775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204158.72835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204158.72904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204158.72944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204158.72971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204158.73079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204158.75086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204158.75120: stderr chunk (state=3): >>><<< 15980 1727204158.75124: stdout chunk (state=3): >>><<< 15980 1727204158.75272: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204158.75276: handler run complete 15980 1727204158.75320: variable 'ansible_facts' from source: unknown 15980 1727204158.75447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204158.75842: variable 'ansible_facts' from source: unknown 15980 1727204158.75954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204158.76111: attempt loop complete, returning result 15980 1727204158.76121: _execute() done 15980 1727204158.76132: dumping result to json 15980 1727204158.76267: done dumping result, returning 15980 1727204158.76272: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-5f1d-4b72-000000000219] 15980 1727204158.76275: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000219 ok: [managed-node2] 15980 1727204158.77534: no more pending results, returning what we have 15980 1727204158.77538: results queue empty 15980 1727204158.77539: checking for any_errors_fatal 15980 1727204158.77540: done checking for any_errors_fatal 15980 1727204158.77541: checking for max_fail_percentage 15980 1727204158.77550: done checking for max_fail_percentage 15980 1727204158.77553: checking to see if all hosts have failed and the running result is not ok 15980 1727204158.77554: done checking to see if all hosts have failed 15980 1727204158.77555: getting the remaining hosts for this loop 15980 1727204158.77556: done getting the remaining hosts for this loop 15980 1727204158.77560: getting the next task for host managed-node2 15980 1727204158.77567: done getting next task for host managed-node2 15980 1727204158.77569: ^ task is: TASK: meta (flush_handlers) 15980 1727204158.77571: ^ state is: 
HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204158.77574: getting variables 15980 1727204158.77576: in VariableManager get_vars() 15980 1727204158.77599: Calling all_inventory to load vars for managed-node2 15980 1727204158.77601: Calling groups_inventory to load vars for managed-node2 15980 1727204158.77605: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204158.77614: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000219 15980 1727204158.77617: WORKER PROCESS EXITING 15980 1727204158.77630: Calling all_plugins_play to load vars for managed-node2 15980 1727204158.77634: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204158.77637: Calling groups_plugins_play to load vars for managed-node2 15980 1727204158.79407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204158.81791: done with get_vars() 15980 1727204158.81832: done getting variables 15980 1727204158.81930: in VariableManager get_vars() 15980 1727204158.81941: Calling all_inventory to load vars for managed-node2 15980 1727204158.81944: Calling groups_inventory to load vars for managed-node2 15980 1727204158.81947: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204158.81953: Calling all_plugins_play to load vars for managed-node2 15980 1727204158.81955: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204158.81958: Calling groups_plugins_play to load vars for managed-node2 15980 1727204158.83769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204158.86132: done with get_vars() 15980 1727204158.86187: done 
queuing things up, now waiting for results queue to drain 15980 1727204158.86189: results queue empty 15980 1727204158.86190: checking for any_errors_fatal 15980 1727204158.86195: done checking for any_errors_fatal 15980 1727204158.86196: checking for max_fail_percentage 15980 1727204158.86197: done checking for max_fail_percentage 15980 1727204158.86203: checking to see if all hosts have failed and the running result is not ok 15980 1727204158.86204: done checking to see if all hosts have failed 15980 1727204158.86205: getting the remaining hosts for this loop 15980 1727204158.86206: done getting the remaining hosts for this loop 15980 1727204158.86209: getting the next task for host managed-node2 15980 1727204158.86213: done getting next task for host managed-node2 15980 1727204158.86216: ^ task is: TASK: Include the task '{{ task }}' 15980 1727204158.86217: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204158.86219: getting variables 15980 1727204158.86221: in VariableManager get_vars() 15980 1727204158.86233: Calling all_inventory to load vars for managed-node2 15980 1727204158.86236: Calling groups_inventory to load vars for managed-node2 15980 1727204158.86238: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204158.86244: Calling all_plugins_play to load vars for managed-node2 15980 1727204158.86247: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204158.86250: Calling groups_plugins_play to load vars for managed-node2 15980 1727204158.87898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204158.90330: done with get_vars() 15980 1727204158.90360: done getting variables 15980 1727204158.90578: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_present.yml'] ********************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Tuesday 24 September 2024 14:55:58 -0400 (0:00:01.117) 0:00:20.316 ***** 15980 1727204158.90610: entering _queue_task() for managed-node2/include_tasks 15980 1727204158.91051: worker is 1 (out of 1 available) 15980 1727204158.91064: exiting _queue_task() for managed-node2/include_tasks 15980 1727204158.91185: done queuing things up, now waiting for results queue to drain 15980 1727204158.91187: waiting for pending results... 
15980 1727204158.91418: running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_device_present.yml' 15980 1727204158.91563: in run() - task 127b8e07-fff9-5f1d-4b72-00000000002d 15980 1727204158.91615: variable 'ansible_search_path' from source: unknown 15980 1727204158.91648: calling self._execute() 15980 1727204158.91764: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204158.91834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204158.91838: variable 'omit' from source: magic vars 15980 1727204158.92258: variable 'ansible_distribution_major_version' from source: facts 15980 1727204158.92289: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204158.92303: variable 'task' from source: play vars 15980 1727204158.92399: variable 'task' from source: play vars 15980 1727204158.92418: _execute() done 15980 1727204158.92430: dumping result to json 15980 1727204158.92471: done dumping result, returning 15980 1727204158.92475: done running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_device_present.yml' [127b8e07-fff9-5f1d-4b72-00000000002d] 15980 1727204158.92482: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000002d 15980 1727204158.92640: no more pending results, returning what we have 15980 1727204158.92647: in VariableManager get_vars() 15980 1727204158.92890: Calling all_inventory to load vars for managed-node2 15980 1727204158.92894: Calling groups_inventory to load vars for managed-node2 15980 1727204158.92899: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204158.92916: Calling all_plugins_play to load vars for managed-node2 15980 1727204158.92920: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204158.92923: Calling groups_plugins_play to load vars for managed-node2 15980 1727204158.93548: done sending task result for task 
127b8e07-fff9-5f1d-4b72-00000000002d 15980 1727204158.93552: WORKER PROCESS EXITING 15980 1727204158.94961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204158.97315: done with get_vars() 15980 1727204158.97361: variable 'ansible_search_path' from source: unknown 15980 1727204158.97383: we have included files to process 15980 1727204158.97384: generating all_blocks data 15980 1727204158.97386: done generating all_blocks data 15980 1727204158.97387: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15980 1727204158.97388: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15980 1727204158.97391: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15980 1727204158.97593: in VariableManager get_vars() 15980 1727204158.97605: done with get_vars() 15980 1727204158.97704: done processing included file 15980 1727204158.97705: iterating over new_blocks loaded from include file 15980 1727204158.97706: in VariableManager get_vars() 15980 1727204158.97715: done with get_vars() 15980 1727204158.97716: filtering new block on tags 15980 1727204158.97733: done filtering new block on tags 15980 1727204158.97736: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node2 15980 1727204158.97740: extending task lists for all hosts with included blocks 15980 1727204158.97769: done extending task lists 15980 1727204158.97770: done processing included files 15980 1727204158.97770: results queue empty 15980 1727204158.97771: checking for any_errors_fatal 15980 1727204158.97772: done checking 
for any_errors_fatal 15980 1727204158.97773: checking for max_fail_percentage 15980 1727204158.97774: done checking for max_fail_percentage 15980 1727204158.97774: checking to see if all hosts have failed and the running result is not ok 15980 1727204158.97775: done checking to see if all hosts have failed 15980 1727204158.97775: getting the remaining hosts for this loop 15980 1727204158.97776: done getting the remaining hosts for this loop 15980 1727204158.97778: getting the next task for host managed-node2 15980 1727204158.97780: done getting next task for host managed-node2 15980 1727204158.97782: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15980 1727204158.97784: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204158.97785: getting variables 15980 1727204158.97786: in VariableManager get_vars() 15980 1727204158.97793: Calling all_inventory to load vars for managed-node2 15980 1727204158.97794: Calling groups_inventory to load vars for managed-node2 15980 1727204158.97796: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204158.97800: Calling all_plugins_play to load vars for managed-node2 15980 1727204158.97802: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204158.97804: Calling groups_plugins_play to load vars for managed-node2 15980 1727204158.98746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204159.00446: done with get_vars() 15980 1727204159.00485: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:55:59 -0400 (0:00:00.099) 0:00:20.416 ***** 15980 1727204159.00594: entering _queue_task() for managed-node2/include_tasks 15980 1727204159.00901: worker is 1 (out of 1 available) 15980 1727204159.00916: exiting _queue_task() for managed-node2/include_tasks 15980 1727204159.00930: done queuing things up, now waiting for results queue to drain 15980 1727204159.00932: waiting for pending results... 
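The include chain above (`assert_device_present.yml` pulling in `get_interface_stat.yml`) ultimately stats `/sys/class/net/<interface>` on the managed node, as the module invocation later in this log shows. A minimal local sketch of the equivalent check, assuming a Linux sysfs layout where each visible network device appears as a symlink under `/sys/class/net`:

```python
import os

def interface_sysfs_path(interface: str) -> str:
    # The stat task targets this path for the interface under test.
    return os.path.join("/sys/class/net", interface)

def device_present(interface: str) -> bool:
    # lexists() counts a symlink itself, matching how the stat module
    # reports islnk=true for sysfs device entries.
    return os.path.lexists(interface_sysfs_path(interface))
```

`device_present` is a hypothetical helper for illustration; the playbook performs this check remotely via the `stat` module rather than any local call.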
15980 1727204159.01120: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 15980 1727204159.01212: in run() - task 127b8e07-fff9-5f1d-4b72-00000000022a 15980 1727204159.01222: variable 'ansible_search_path' from source: unknown 15980 1727204159.01228: variable 'ansible_search_path' from source: unknown 15980 1727204159.01259: calling self._execute() 15980 1727204159.01335: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204159.01339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204159.01349: variable 'omit' from source: magic vars 15980 1727204159.01663: variable 'ansible_distribution_major_version' from source: facts 15980 1727204159.01675: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204159.01682: _execute() done 15980 1727204159.01685: dumping result to json 15980 1727204159.01688: done dumping result, returning 15980 1727204159.01695: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-5f1d-4b72-00000000022a] 15980 1727204159.01700: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000022a 15980 1727204159.01801: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000022a 15980 1727204159.01807: WORKER PROCESS EXITING 15980 1727204159.01840: no more pending results, returning what we have 15980 1727204159.01846: in VariableManager get_vars() 15980 1727204159.01886: Calling all_inventory to load vars for managed-node2 15980 1727204159.01889: Calling groups_inventory to load vars for managed-node2 15980 1727204159.01892: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204159.01907: Calling all_plugins_play to load vars for managed-node2 15980 1727204159.01909: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204159.01912: Calling groups_plugins_play to load vars for managed-node2 15980 
1727204159.03355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204159.05971: done with get_vars() 15980 1727204159.06003: variable 'ansible_search_path' from source: unknown 15980 1727204159.06005: variable 'ansible_search_path' from source: unknown 15980 1727204159.06020: variable 'task' from source: play vars 15980 1727204159.06169: variable 'task' from source: play vars 15980 1727204159.06216: we have included files to process 15980 1727204159.06218: generating all_blocks data 15980 1727204159.06219: done generating all_blocks data 15980 1727204159.06221: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15980 1727204159.06222: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15980 1727204159.06224: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15980 1727204159.06518: done processing included file 15980 1727204159.06520: iterating over new_blocks loaded from include file 15980 1727204159.06522: in VariableManager get_vars() 15980 1727204159.06540: done with get_vars() 15980 1727204159.06542: filtering new block on tags 15980 1727204159.06560: done filtering new block on tags 15980 1727204159.06562: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 15980 1727204159.06571: extending task lists for all hosts with included blocks 15980 1727204159.06694: done extending task lists 15980 1727204159.06695: done processing included files 15980 1727204159.06696: results queue empty 15980 1727204159.06697: checking for any_errors_fatal 15980 1727204159.06700: done checking 
for any_errors_fatal 15980 1727204159.06701: checking for max_fail_percentage 15980 1727204159.06702: done checking for max_fail_percentage 15980 1727204159.06703: checking to see if all hosts have failed and the running result is not ok 15980 1727204159.06704: done checking to see if all hosts have failed 15980 1727204159.06705: getting the remaining hosts for this loop 15980 1727204159.06706: done getting the remaining hosts for this loop 15980 1727204159.06709: getting the next task for host managed-node2 15980 1727204159.06713: done getting next task for host managed-node2 15980 1727204159.06716: ^ task is: TASK: Get stat for interface {{ interface }} 15980 1727204159.06719: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204159.06721: getting variables 15980 1727204159.06735: in VariableManager get_vars() 15980 1727204159.06746: Calling all_inventory to load vars for managed-node2 15980 1727204159.06749: Calling groups_inventory to load vars for managed-node2 15980 1727204159.06751: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204159.06757: Calling all_plugins_play to load vars for managed-node2 15980 1727204159.06759: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204159.06762: Calling groups_plugins_play to load vars for managed-node2 15980 1727204159.08533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204159.12816: done with get_vars() 15980 1727204159.12857: done getting variables 15980 1727204159.13018: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:55:59 -0400 (0:00:00.124) 0:00:20.540 ***** 15980 1727204159.13059: entering _queue_task() for managed-node2/stat 15980 1727204159.13549: worker is 1 (out of 1 available) 15980 1727204159.13564: exiting _queue_task() for managed-node2/stat 15980 1727204159.13643: done queuing things up, now waiting for results queue to drain 15980 1727204159.13645: waiting for pending results... 
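The task queued above runs the `stat` module on `managed-node2` and returns a JSON result whose `stat.exists` field drives the presence assertion. A hedged sketch of how such a result dict can be interpreted (the field names mirror the module output recorded later in this log; `check_device_stat` itself is an illustrative helper, not part of the test suite):

```python
def check_device_stat(result: dict, expected_path: str) -> bool:
    """Treat the device as present when the stat result reports
    exists=true for the expected sysfs path."""
    st = result.get("stat", {})
    return bool(st.get("exists")) and st.get("path") == expected_path

# Trimmed-down shape of the module result seen in this run:
sample = {
    "changed": False,
    "stat": {"exists": True, "islnk": True,
             "path": "/sys/class/net/LSR-TST-br31"},
}
present = check_device_stat(sample, "/sys/class/net/LSR-TST-br31")
```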
15980 1727204159.13931: running TaskExecutor() for managed-node2/TASK: Get stat for interface LSR-TST-br31 15980 1727204159.14073: in run() - task 127b8e07-fff9-5f1d-4b72-000000000235 15980 1727204159.14134: variable 'ansible_search_path' from source: unknown 15980 1727204159.14137: variable 'ansible_search_path' from source: unknown 15980 1727204159.14158: calling self._execute() 15980 1727204159.14275: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204159.14292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204159.14349: variable 'omit' from source: magic vars 15980 1727204159.14794: variable 'ansible_distribution_major_version' from source: facts 15980 1727204159.14819: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204159.14840: variable 'omit' from source: magic vars 15980 1727204159.14900: variable 'omit' from source: magic vars 15980 1727204159.15032: variable 'interface' from source: set_fact 15980 1727204159.15112: variable 'omit' from source: magic vars 15980 1727204159.15116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204159.15170: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204159.15198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204159.15235: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204159.15259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204159.15304: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204159.15314: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204159.15377: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204159.15471: Set connection var ansible_connection to ssh 15980 1727204159.15495: Set connection var ansible_pipelining to False 15980 1727204159.15509: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204159.15520: Set connection var ansible_timeout to 10 15980 1727204159.15535: Set connection var ansible_shell_type to sh 15980 1727204159.15571: Set connection var ansible_shell_executable to /bin/sh 15980 1727204159.15600: variable 'ansible_shell_executable' from source: unknown 15980 1727204159.15615: variable 'ansible_connection' from source: unknown 15980 1727204159.15655: variable 'ansible_module_compression' from source: unknown 15980 1727204159.15659: variable 'ansible_shell_type' from source: unknown 15980 1727204159.15661: variable 'ansible_shell_executable' from source: unknown 15980 1727204159.15664: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204159.15668: variable 'ansible_pipelining' from source: unknown 15980 1727204159.15671: variable 'ansible_timeout' from source: unknown 15980 1727204159.15673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204159.15972: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204159.15982: variable 'omit' from source: magic vars 15980 1727204159.15986: starting attempt loop 15980 1727204159.15989: running the handler 15980 1727204159.16009: _low_level_execute_command(): starting 15980 1727204159.16031: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204159.16970: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204159.16994: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204159.17034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204159.17053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204159.17093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204159.17222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204159.19004: stdout chunk (state=3): >>>/root <<< 15980 1727204159.19237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204159.19241: stdout chunk (state=3): >>><<< 15980 1727204159.19244: stderr chunk (state=3): >>><<< 15980 1727204159.19373: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204159.19378: _low_level_execute_command(): starting 15980 1727204159.19381: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204159.192695-17823-25767017417169 `" && echo ansible-tmp-1727204159.192695-17823-25767017417169="` echo /root/.ansible/tmp/ansible-tmp-1727204159.192695-17823-25767017417169 `" ) && sleep 0' 15980 1727204159.20075: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204159.20097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204159.20203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204159.20273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204159.20301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204159.20333: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204159.20447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204159.22439: stdout chunk (state=3): >>>ansible-tmp-1727204159.192695-17823-25767017417169=/root/.ansible/tmp/ansible-tmp-1727204159.192695-17823-25767017417169 <<< 15980 1727204159.22661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204159.22667: stdout chunk (state=3): >>><<< 15980 1727204159.22670: stderr chunk (state=3): >>><<< 15980 1727204159.22690: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204159.192695-17823-25767017417169=/root/.ansible/tmp/ansible-tmp-1727204159.192695-17823-25767017417169 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204159.22769: variable 'ansible_module_compression' from source: unknown 15980 1727204159.22873: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15980 1727204159.22901: variable 'ansible_facts' from source: unknown 15980 1727204159.23023: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204159.192695-17823-25767017417169/AnsiballZ_stat.py 15980 1727204159.23329: Sending initial data 15980 1727204159.23332: Sent initial data (151 bytes) 15980 1727204159.24020: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204159.24078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204159.24110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204159.24136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204159.24244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204159.25863: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15980 1727204159.25907: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204159.25975: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204159.26058: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp11yv366r /root/.ansible/tmp/ansible-tmp-1727204159.192695-17823-25767017417169/AnsiballZ_stat.py <<< 15980 1727204159.26061: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204159.192695-17823-25767017417169/AnsiballZ_stat.py" <<< 15980 1727204159.26127: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp11yv366r" to remote "/root/.ansible/tmp/ansible-tmp-1727204159.192695-17823-25767017417169/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204159.192695-17823-25767017417169/AnsiballZ_stat.py" <<< 15980 1727204159.27002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204159.27099: stderr chunk (state=3): >>><<< 15980 1727204159.27102: stdout chunk (state=3): >>><<< 15980 1727204159.27205: done transferring module to remote 15980 1727204159.27209: _low_level_execute_command(): starting 15980 1727204159.27214: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204159.192695-17823-25767017417169/ /root/.ansible/tmp/ansible-tmp-1727204159.192695-17823-25767017417169/AnsiballZ_stat.py && sleep 0' 15980 1727204159.27815: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204159.27824: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204159.27840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204159.27860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204159.27874: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 1727204159.27877: 
stderr chunk (state=3): >>>debug2: match not found <<< 15980 1727204159.27887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204159.27902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15980 1727204159.27910: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 15980 1727204159.27977: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15980 1727204159.27980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204159.27984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204159.27986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204159.27988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 1727204159.27990: stderr chunk (state=3): >>>debug2: match found <<< 15980 1727204159.27998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204159.28070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204159.28074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204159.28076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204159.28184: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204159.30274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204159.30279: stderr chunk (state=3): >>><<< 15980 1727204159.30281: stdout chunk (state=3): >>><<< 15980 1727204159.30285: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204159.30288: _low_level_execute_command(): starting 15980 1727204159.30290: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204159.192695-17823-25767017417169/AnsiballZ_stat.py && sleep 0' 15980 1727204159.30891: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204159.30900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204159.30960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204159.31032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204159.31047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204159.31079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204159.31209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204159.47895: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35354, "dev": 23, "nlink": 1, "atime": 1727204156.0180304, "mtime": 1727204156.0180304, "ctime": 1727204156.0180304, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15980 1727204159.49207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.47.73 closed. <<< 15980 1727204159.49270: stderr chunk (state=3): >>><<< 15980 1727204159.49274: stdout chunk (state=3): >>><<< 15980 1727204159.49291: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35354, "dev": 23, "nlink": 1, "atime": 1727204156.0180304, "mtime": 1727204156.0180304, "ctime": 1727204156.0180304, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 15980 1727204159.49352: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204159.192695-17823-25767017417169/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204159.49361: _low_level_execute_command(): starting 15980 1727204159.49368: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204159.192695-17823-25767017417169/ > /dev/null 2>&1 && sleep 0' 15980 1727204159.49993: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204159.49998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204159.50023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204159.50082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204159.52003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204159.52063: stderr chunk (state=3): >>><<< 15980 1727204159.52069: stdout chunk (state=3): >>><<< 15980 1727204159.52085: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204159.52091: handler run complete 15980 1727204159.52130: attempt loop complete, returning result 15980 1727204159.52134: _execute() done 15980 1727204159.52136: dumping result to json 15980 1727204159.52143: done dumping result, returning 15980 1727204159.52150: done running TaskExecutor() for managed-node2/TASK: Get stat for interface LSR-TST-br31 [127b8e07-fff9-5f1d-4b72-000000000235] 15980 1727204159.52154: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000235 15980 1727204159.52276: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000235 15980 1727204159.52279: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204156.0180304, "block_size": 4096, "blocks": 0, "ctime": 1727204156.0180304, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 35354, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "mode": "0777", "mtime": 1727204156.0180304, "nlink": 1, "path": "/sys/class/net/LSR-TST-br31", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 15980 1727204159.52389: no more pending results, returning what we have 15980 1727204159.52393: results queue empty 15980 1727204159.52394: checking for any_errors_fatal 15980 1727204159.52395: done checking for 
any_errors_fatal 15980 1727204159.52396: checking for max_fail_percentage 15980 1727204159.52397: done checking for max_fail_percentage 15980 1727204159.52398: checking to see if all hosts have failed and the running result is not ok 15980 1727204159.52399: done checking to see if all hosts have failed 15980 1727204159.52400: getting the remaining hosts for this loop 15980 1727204159.52402: done getting the remaining hosts for this loop 15980 1727204159.52406: getting the next task for host managed-node2 15980 1727204159.52415: done getting next task for host managed-node2 15980 1727204159.52417: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 15980 1727204159.52420: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204159.52424: getting variables 15980 1727204159.52428: in VariableManager get_vars() 15980 1727204159.52455: Calling all_inventory to load vars for managed-node2 15980 1727204159.52458: Calling groups_inventory to load vars for managed-node2 15980 1727204159.52461: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204159.52474: Calling all_plugins_play to load vars for managed-node2 15980 1727204159.52487: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204159.52491: Calling groups_plugins_play to load vars for managed-node2 15980 1727204159.54094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204159.55746: done with get_vars() 15980 1727204159.55785: done getting variables 15980 1727204159.55836: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15980 1727204159.55946: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'LSR-TST-br31'] ******************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:55:59 -0400 (0:00:00.429) 0:00:20.970 ***** 15980 1727204159.55973: entering _queue_task() for managed-node2/assert 15980 1727204159.56273: worker is 1 (out of 1 available) 15980 1727204159.56288: exiting _queue_task() for managed-node2/assert 15980 1727204159.56300: done queuing things up, now waiting for results queue to drain 15980 1727204159.56302: waiting for pending results... 
15980 1727204159.56504: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'LSR-TST-br31' 15980 1727204159.56575: in run() - task 127b8e07-fff9-5f1d-4b72-00000000022b 15980 1727204159.56587: variable 'ansible_search_path' from source: unknown 15980 1727204159.56591: variable 'ansible_search_path' from source: unknown 15980 1727204159.56625: calling self._execute() 15980 1727204159.56717: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204159.56722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204159.56735: variable 'omit' from source: magic vars 15980 1727204159.57040: variable 'ansible_distribution_major_version' from source: facts 15980 1727204159.57049: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204159.57057: variable 'omit' from source: magic vars 15980 1727204159.57093: variable 'omit' from source: magic vars 15980 1727204159.57175: variable 'interface' from source: set_fact 15980 1727204159.57189: variable 'omit' from source: magic vars 15980 1727204159.57225: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204159.57258: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204159.57302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204159.57306: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204159.57332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204159.57360: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204159.57363: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204159.57366: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204159.57455: Set connection var ansible_connection to ssh 15980 1727204159.57462: Set connection var ansible_pipelining to False 15980 1727204159.57470: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204159.57476: Set connection var ansible_timeout to 10 15980 1727204159.57482: Set connection var ansible_shell_type to sh 15980 1727204159.57487: Set connection var ansible_shell_executable to /bin/sh 15980 1727204159.57539: variable 'ansible_shell_executable' from source: unknown 15980 1727204159.57544: variable 'ansible_connection' from source: unknown 15980 1727204159.57547: variable 'ansible_module_compression' from source: unknown 15980 1727204159.57549: variable 'ansible_shell_type' from source: unknown 15980 1727204159.57552: variable 'ansible_shell_executable' from source: unknown 15980 1727204159.57554: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204159.57556: variable 'ansible_pipelining' from source: unknown 15980 1727204159.57558: variable 'ansible_timeout' from source: unknown 15980 1727204159.57561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204159.57709: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204159.57728: variable 'omit' from source: magic vars 15980 1727204159.57731: starting attempt loop 15980 1727204159.57735: running the handler 15980 1727204159.57854: variable 'interface_stat' from source: set_fact 15980 1727204159.57863: Evaluated conditional (interface_stat.stat.exists): True 15980 1727204159.57874: handler run complete 15980 1727204159.57885: attempt loop complete, returning result 15980 
1727204159.57888: _execute() done 15980 1727204159.57891: dumping result to json 15980 1727204159.57893: done dumping result, returning 15980 1727204159.57900: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'LSR-TST-br31' [127b8e07-fff9-5f1d-4b72-00000000022b] 15980 1727204159.57905: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000022b 15980 1727204159.58008: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000022b 15980 1727204159.58011: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 15980 1727204159.58075: no more pending results, returning what we have 15980 1727204159.58078: results queue empty 15980 1727204159.58079: checking for any_errors_fatal 15980 1727204159.58089: done checking for any_errors_fatal 15980 1727204159.58090: checking for max_fail_percentage 15980 1727204159.58091: done checking for max_fail_percentage 15980 1727204159.58092: checking to see if all hosts have failed and the running result is not ok 15980 1727204159.58093: done checking to see if all hosts have failed 15980 1727204159.58094: getting the remaining hosts for this loop 15980 1727204159.58096: done getting the remaining hosts for this loop 15980 1727204159.58100: getting the next task for host managed-node2 15980 1727204159.58108: done getting next task for host managed-node2 15980 1727204159.58110: ^ task is: TASK: meta (flush_handlers) 15980 1727204159.58112: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204159.58116: getting variables 15980 1727204159.58117: in VariableManager get_vars() 15980 1727204159.58151: Calling all_inventory to load vars for managed-node2 15980 1727204159.58154: Calling groups_inventory to load vars for managed-node2 15980 1727204159.58158: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204159.58171: Calling all_plugins_play to load vars for managed-node2 15980 1727204159.58174: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204159.58177: Calling groups_plugins_play to load vars for managed-node2 15980 1727204159.59287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204159.60800: done with get_vars() 15980 1727204159.60831: done getting variables 15980 1727204159.60918: in VariableManager get_vars() 15980 1727204159.60926: Calling all_inventory to load vars for managed-node2 15980 1727204159.60929: Calling groups_inventory to load vars for managed-node2 15980 1727204159.60931: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204159.60937: Calling all_plugins_play to load vars for managed-node2 15980 1727204159.60940: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204159.60943: Calling groups_plugins_play to load vars for managed-node2 15980 1727204159.62074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204159.63495: done with get_vars() 15980 1727204159.63531: done queuing things up, now waiting for results queue to drain 15980 1727204159.63533: results queue empty 15980 1727204159.63533: checking for any_errors_fatal 15980 1727204159.63536: done checking for any_errors_fatal 15980 1727204159.63536: checking for max_fail_percentage 15980 1727204159.63537: done checking for max_fail_percentage 15980 1727204159.63538: checking to see if all hosts have failed and the running result is not 
ok 15980 1727204159.63539: done checking to see if all hosts have failed 15980 1727204159.63546: getting the remaining hosts for this loop 15980 1727204159.63547: done getting the remaining hosts for this loop 15980 1727204159.63551: getting the next task for host managed-node2 15980 1727204159.63555: done getting next task for host managed-node2 15980 1727204159.63557: ^ task is: TASK: meta (flush_handlers) 15980 1727204159.63559: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204159.63562: getting variables 15980 1727204159.63562: in VariableManager get_vars() 15980 1727204159.63573: Calling all_inventory to load vars for managed-node2 15980 1727204159.63575: Calling groups_inventory to load vars for managed-node2 15980 1727204159.63577: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204159.63584: Calling all_plugins_play to load vars for managed-node2 15980 1727204159.63587: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204159.63592: Calling groups_plugins_play to load vars for managed-node2 15980 1727204159.64963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204159.66334: done with get_vars() 15980 1727204159.66361: done getting variables 15980 1727204159.66405: in VariableManager get_vars() 15980 1727204159.66415: Calling all_inventory to load vars for managed-node2 15980 1727204159.66416: Calling groups_inventory to load vars for managed-node2 15980 1727204159.66418: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204159.66422: Calling all_plugins_play to load vars for managed-node2 15980 1727204159.66424: Calling groups_plugins_inventory to load vars for 
managed-node2 15980 1727204159.66426: Calling groups_plugins_play to load vars for managed-node2 15980 1727204159.67558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204159.69176: done with get_vars() 15980 1727204159.69202: done queuing things up, now waiting for results queue to drain 15980 1727204159.69204: results queue empty 15980 1727204159.69204: checking for any_errors_fatal 15980 1727204159.69205: done checking for any_errors_fatal 15980 1727204159.69206: checking for max_fail_percentage 15980 1727204159.69206: done checking for max_fail_percentage 15980 1727204159.69207: checking to see if all hosts have failed and the running result is not ok 15980 1727204159.69208: done checking to see if all hosts have failed 15980 1727204159.69209: getting the remaining hosts for this loop 15980 1727204159.69210: done getting the remaining hosts for this loop 15980 1727204159.69212: getting the next task for host managed-node2 15980 1727204159.69214: done getting next task for host managed-node2 15980 1727204159.69215: ^ task is: None 15980 1727204159.69216: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204159.69217: done queuing things up, now waiting for results queue to drain 15980 1727204159.69217: results queue empty 15980 1727204159.69218: checking for any_errors_fatal 15980 1727204159.69218: done checking for any_errors_fatal 15980 1727204159.69219: checking for max_fail_percentage 15980 1727204159.69219: done checking for max_fail_percentage 15980 1727204159.69220: checking to see if all hosts have failed and the running result is not ok 15980 1727204159.69220: done checking to see if all hosts have failed 15980 1727204159.69221: getting the next task for host managed-node2 15980 1727204159.69223: done getting next task for host managed-node2 15980 1727204159.69224: ^ task is: None 15980 1727204159.69225: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204159.69261: in VariableManager get_vars() 15980 1727204159.69280: done with get_vars() 15980 1727204159.69287: in VariableManager get_vars() 15980 1727204159.69301: done with get_vars() 15980 1727204159.69306: variable 'omit' from source: magic vars 15980 1727204159.69431: variable 'task' from source: play vars 15980 1727204159.69462: in VariableManager get_vars() 15980 1727204159.69475: done with get_vars() 15980 1727204159.69495: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_present.yml] *********************** 15980 1727204159.69699: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15980 1727204159.69727: getting the remaining hosts for this loop 15980 1727204159.69729: done getting the remaining hosts for this loop 15980 1727204159.69732: getting the next task for host managed-node2 15980 1727204159.69735: done getting next task for host managed-node2 15980 1727204159.69737: ^ task is: TASK: Gathering Facts 15980 1727204159.69739: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204159.69741: getting variables 15980 1727204159.69742: in VariableManager get_vars() 15980 1727204159.69751: Calling all_inventory to load vars for managed-node2 15980 1727204159.69753: Calling groups_inventory to load vars for managed-node2 15980 1727204159.69756: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204159.69760: Calling all_plugins_play to load vars for managed-node2 15980 1727204159.69762: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204159.69764: Calling groups_plugins_play to load vars for managed-node2 15980 1727204159.70778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204159.72051: done with get_vars() 15980 1727204159.72081: done getting variables 15980 1727204159.72119: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Tuesday 24 September 2024 14:55:59 -0400 (0:00:00.161) 0:00:21.131 ***** 15980 1727204159.72143: entering _queue_task() for managed-node2/gather_facts 15980 1727204159.72437: worker is 1 (out of 1 available) 15980 1727204159.72452: exiting _queue_task() for managed-node2/gather_facts 15980 1727204159.72465: done queuing things up, now waiting for results queue to drain 15980 1727204159.72469: waiting for pending results... 
15980 1727204159.72718: running TaskExecutor() for managed-node2/TASK: Gathering Facts 15980 1727204159.72805: in run() - task 127b8e07-fff9-5f1d-4b72-00000000024e 15980 1727204159.72820: variable 'ansible_search_path' from source: unknown 15980 1727204159.72882: calling self._execute() 15980 1727204159.72959: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204159.72963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204159.72973: variable 'omit' from source: magic vars 15980 1727204159.73387: variable 'ansible_distribution_major_version' from source: facts 15980 1727204159.73399: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204159.73405: variable 'omit' from source: magic vars 15980 1727204159.73434: variable 'omit' from source: magic vars 15980 1727204159.73486: variable 'omit' from source: magic vars 15980 1727204159.73509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204159.73544: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204159.73583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204159.73599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204159.73611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204159.73639: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204159.73649: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204159.73652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204159.73725: Set connection var ansible_connection to ssh 15980 1727204159.73734: Set 
connection var ansible_pipelining to False 15980 1727204159.73741: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204159.73746: Set connection var ansible_timeout to 10 15980 1727204159.73753: Set connection var ansible_shell_type to sh 15980 1727204159.73762: Set connection var ansible_shell_executable to /bin/sh 15980 1727204159.73783: variable 'ansible_shell_executable' from source: unknown 15980 1727204159.73786: variable 'ansible_connection' from source: unknown 15980 1727204159.73788: variable 'ansible_module_compression' from source: unknown 15980 1727204159.73791: variable 'ansible_shell_type' from source: unknown 15980 1727204159.73823: variable 'ansible_shell_executable' from source: unknown 15980 1727204159.73826: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204159.73828: variable 'ansible_pipelining' from source: unknown 15980 1727204159.73831: variable 'ansible_timeout' from source: unknown 15980 1727204159.73834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204159.73981: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204159.73992: variable 'omit' from source: magic vars 15980 1727204159.73997: starting attempt loop 15980 1727204159.74000: running the handler 15980 1727204159.74071: variable 'ansible_facts' from source: unknown 15980 1727204159.74074: _low_level_execute_command(): starting 15980 1727204159.74077: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204159.74836: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 15980 1727204159.74862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204159.74927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204159.74932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204159.74935: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204159.75010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204159.76770: stdout chunk (state=3): >>>/root <<< 15980 1727204159.76884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204159.76972: stderr chunk (state=3): >>><<< 15980 1727204159.76976: stdout chunk (state=3): >>><<< 15980 1727204159.77002: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204159.77023: _low_level_execute_command(): starting 15980 1727204159.77039: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204159.7700942-17837-190993465976749 `" && echo ansible-tmp-1727204159.7700942-17837-190993465976749="` echo /root/.ansible/tmp/ansible-tmp-1727204159.7700942-17837-190993465976749 `" ) && sleep 0' 15980 1727204159.77699: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204159.77702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 1727204159.77706: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 15980 1727204159.77718: stderr chunk (state=3): >>>debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 1727204159.77721: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204159.77784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204159.77805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204159.77807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204159.77862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204159.79838: stdout chunk (state=3): >>>ansible-tmp-1727204159.7700942-17837-190993465976749=/root/.ansible/tmp/ansible-tmp-1727204159.7700942-17837-190993465976749 <<< 15980 1727204159.79972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204159.80024: stderr chunk (state=3): >>><<< 15980 1727204159.80028: stdout chunk (state=3): >>><<< 15980 1727204159.80045: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204159.7700942-17837-190993465976749=/root/.ansible/tmp/ansible-tmp-1727204159.7700942-17837-190993465976749 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204159.80079: variable 'ansible_module_compression' from source: unknown 15980 1727204159.80123: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15980 1727204159.80186: variable 'ansible_facts' from source: unknown 15980 1727204159.80291: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204159.7700942-17837-190993465976749/AnsiballZ_setup.py 15980 1727204159.80416: Sending initial data 15980 1727204159.80419: Sent initial data (154 bytes) 15980 1727204159.80963: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204159.80969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204159.80972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15980 1727204159.80975: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 15980 1727204159.80979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204159.81028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204159.81031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204159.81034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204159.81161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204159.82748: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204159.82847: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204159.82957: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpln41q0g7 /root/.ansible/tmp/ansible-tmp-1727204159.7700942-17837-190993465976749/AnsiballZ_setup.py <<< 15980 1727204159.82963: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204159.7700942-17837-190993465976749/AnsiballZ_setup.py" <<< 15980 1727204159.83012: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpln41q0g7" to remote "/root/.ansible/tmp/ansible-tmp-1727204159.7700942-17837-190993465976749/AnsiballZ_setup.py" <<< 15980 1727204159.83022: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204159.7700942-17837-190993465976749/AnsiballZ_setup.py" <<< 15980 1727204159.84671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204159.84761: stderr chunk (state=3): >>><<< 15980 1727204159.84843: stdout chunk (state=3): >>><<< 15980 1727204159.84877: done transferring module to remote 15980 1727204159.84890: _low_level_execute_command(): starting 15980 1727204159.84895: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204159.7700942-17837-190993465976749/ /root/.ansible/tmp/ansible-tmp-1727204159.7700942-17837-190993465976749/AnsiballZ_setup.py && sleep 0' 15980 1727204159.85598: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204159.85638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204159.85642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204159.85697: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204159.85700: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204159.85781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204159.87717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204159.87745: stderr chunk (state=3): >>><<< 15980 1727204159.87749: stdout chunk (state=3): >>><<< 15980 1727204159.87768: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204159.87818: _low_level_execute_command(): starting 15980 1727204159.87823: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204159.7700942-17837-190993465976749/AnsiballZ_setup.py && sleep 0' 15980 1727204159.88837: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204159.88842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204159.88844: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204159.88847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204159.88900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204159.88904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 
1727204159.88961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204159.89027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204160.56524: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "00", "epoch": "1727204160", "epoch_int": "1727204160", "date": "2024-09-24", "time": "14:56:00", "iso8601_micro": "2024-09-24T18:56:00.186229Z", "iso8601": "2024-09-24T18:56:00Z", "iso8601_basic": "20240924T145600186229", "iso8601_basic_short": "20240924T145600", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root<<< 15980 1727204160.56541: stdout chunk (state=3): >>>": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3046, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 670, "free": 3046}, "nocache": {"free": 3475, "used": 241}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": 
"ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 506, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325702144, "block_size": 4096, "block_total": 64479564, "block_available": 61358814, "block_used": 3120750, "inode_total": 16384000, "inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, 
"releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "42:c8:ed:20:3e:1e", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_local": {}, "ansible_loadavg": {"1m": 0.77734375, "5m": 
0.52783203125, "15m": 0.2587890625}, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15980 1727204160.58631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204160.58696: stderr chunk (state=3): >>><<< 15980 1727204160.58699: stdout chunk (state=3): >>><<< 15980 1727204160.58721: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "00", "epoch": "1727204160", "epoch_int": "1727204160", "date": "2024-09-24", "time": "14:56:00", "iso8601_micro": "2024-09-24T18:56:00.186229Z", "iso8601": "2024-09-24T18:56:00Z", "iso8601_basic": "20240924T145600186229", "iso8601_basic_short": "20240924T145600", "tz": "EDT", "tz_dst": "EDT", 
"tz_offset": "-0400"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3046, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 670, "free": 3046}, "nocache": {"free": 3475, "used": 241}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": 
"NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 506, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 
264108294144, "size_available": 251325702144, "block_size": 4096, "block_total": 64479564, "block_available": 61358814, "block_used": 3120750, "inode_total": 16384000, "inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "42:c8:ed:20:3e:1e", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", 
"fe80::f7:13ff:fe22:8fc1"]}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_local": {}, "ansible_loadavg": {"1m": 0.77734375, "5m": 0.52783203125, "15m": 0.2587890625}, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
15980 1727204160.58939: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204159.7700942-17837-190993465976749/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204160.58949: _low_level_execute_command(): starting 15980 1727204160.58954: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204159.7700942-17837-190993465976749/ > /dev/null 2>&1 && sleep 0' 15980 1727204160.59506: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 15980 1727204160.59511: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204160.59555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204160.59584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204160.59697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204160.61687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204160.61806: stderr chunk (state=3): >>><<< 15980 1727204160.61811: stdout chunk (state=3): >>><<< 15980 1727204160.61814: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204160.61817: handler run complete 15980 1727204160.61976: variable 'ansible_facts' from source: 
unknown 15980 1727204160.62052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204160.62589: variable 'ansible_facts' from source: unknown 15980 1727204160.62703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204160.62782: attempt loop complete, returning result 15980 1727204160.62785: _execute() done 15980 1727204160.62805: dumping result to json 15980 1727204160.62813: done dumping result, returning 15980 1727204160.62816: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-5f1d-4b72-00000000024e] 15980 1727204160.62819: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000024e 15980 1727204160.63052: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000024e 15980 1727204160.63055: WORKER PROCESS EXITING ok: [managed-node2] 15980 1727204160.63411: no more pending results, returning what we have 15980 1727204160.63415: results queue empty 15980 1727204160.63415: checking for any_errors_fatal 15980 1727204160.63416: done checking for any_errors_fatal 15980 1727204160.63417: checking for max_fail_percentage 15980 1727204160.63418: done checking for max_fail_percentage 15980 1727204160.63419: checking to see if all hosts have failed and the running result is not ok 15980 1727204160.63420: done checking to see if all hosts have failed 15980 1727204160.63421: getting the remaining hosts for this loop 15980 1727204160.63422: done getting the remaining hosts for this loop 15980 1727204160.63426: getting the next task for host managed-node2 15980 1727204160.63430: done getting next task for host managed-node2 15980 1727204160.63432: ^ task is: TASK: meta (flush_handlers) 15980 1727204160.63433: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204160.63436: getting variables 15980 1727204160.63437: in VariableManager get_vars() 15980 1727204160.63455: Calling all_inventory to load vars for managed-node2 15980 1727204160.63457: Calling groups_inventory to load vars for managed-node2 15980 1727204160.63459: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204160.63470: Calling all_plugins_play to load vars for managed-node2 15980 1727204160.63472: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204160.63474: Calling groups_plugins_play to load vars for managed-node2 15980 1727204160.69191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204160.70797: done with get_vars() 15980 1727204160.70825: done getting variables 15980 1727204160.70878: in VariableManager get_vars() 15980 1727204160.70886: Calling all_inventory to load vars for managed-node2 15980 1727204160.70887: Calling groups_inventory to load vars for managed-node2 15980 1727204160.70889: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204160.70893: Calling all_plugins_play to load vars for managed-node2 15980 1727204160.70894: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204160.70896: Calling groups_plugins_play to load vars for managed-node2 15980 1727204160.71827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204160.73012: done with get_vars() 15980 1727204160.73042: done queuing things up, now waiting for results queue to drain 15980 1727204160.73044: results queue empty 15980 1727204160.73045: checking for any_errors_fatal 15980 1727204160.73050: done checking for any_errors_fatal 15980 1727204160.73050: checking for max_fail_percentage 15980 1727204160.73051: done checking for 
max_fail_percentage 15980 1727204160.73052: checking to see if all hosts have failed and the running result is not ok 15980 1727204160.73052: done checking to see if all hosts have failed 15980 1727204160.73057: getting the remaining hosts for this loop 15980 1727204160.73058: done getting the remaining hosts for this loop 15980 1727204160.73060: getting the next task for host managed-node2 15980 1727204160.73063: done getting next task for host managed-node2 15980 1727204160.73067: ^ task is: TASK: Include the task '{{ task }}' 15980 1727204160.73068: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204160.73070: getting variables 15980 1727204160.73071: in VariableManager get_vars() 15980 1727204160.73078: Calling all_inventory to load vars for managed-node2 15980 1727204160.73080: Calling groups_inventory to load vars for managed-node2 15980 1727204160.73082: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204160.73087: Calling all_plugins_play to load vars for managed-node2 15980 1727204160.73089: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204160.73090: Calling groups_plugins_play to load vars for managed-node2 15980 1727204160.73942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204160.75261: done with get_vars() 15980 1727204160.75288: done getting variables 15980 1727204160.75407: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_present.yml'] ********************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Tuesday 24 September 2024 14:56:00 -0400 (0:00:01.032) 0:00:22.164 
***** 15980 1727204160.75428: entering _queue_task() for managed-node2/include_tasks 15980 1727204160.75729: worker is 1 (out of 1 available) 15980 1727204160.75746: exiting _queue_task() for managed-node2/include_tasks 15980 1727204160.75758: done queuing things up, now waiting for results queue to drain 15980 1727204160.75760: waiting for pending results... 15980 1727204160.75958: running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_profile_present.yml' 15980 1727204160.76041: in run() - task 127b8e07-fff9-5f1d-4b72-000000000031 15980 1727204160.76051: variable 'ansible_search_path' from source: unknown 15980 1727204160.76084: calling self._execute() 15980 1727204160.76170: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204160.76176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204160.76186: variable 'omit' from source: magic vars 15980 1727204160.76527: variable 'ansible_distribution_major_version' from source: facts 15980 1727204160.76543: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204160.76547: variable 'task' from source: play vars 15980 1727204160.76605: variable 'task' from source: play vars 15980 1727204160.76612: _execute() done 15980 1727204160.76615: dumping result to json 15980 1727204160.76618: done dumping result, returning 15980 1727204160.76628: done running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_profile_present.yml' [127b8e07-fff9-5f1d-4b72-000000000031] 15980 1727204160.76631: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000031 15980 1727204160.76734: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000031 15980 1727204160.76737: WORKER PROCESS EXITING 15980 1727204160.76779: no more pending results, returning what we have 15980 1727204160.76785: in VariableManager get_vars() 15980 1727204160.76823: Calling all_inventory to load vars for managed-node2 
15980 1727204160.76829: Calling groups_inventory to load vars for managed-node2 15980 1727204160.76833: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204160.76849: Calling all_plugins_play to load vars for managed-node2 15980 1727204160.76852: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204160.76855: Calling groups_plugins_play to load vars for managed-node2 15980 1727204160.78025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204160.79700: done with get_vars() 15980 1727204160.79730: variable 'ansible_search_path' from source: unknown 15980 1727204160.79751: we have included files to process 15980 1727204160.79752: generating all_blocks data 15980 1727204160.79754: done generating all_blocks data 15980 1727204160.79755: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15980 1727204160.79756: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15980 1727204160.79759: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15980 1727204160.79940: in VariableManager get_vars() 15980 1727204160.79953: done with get_vars() 15980 1727204160.80196: done processing included file 15980 1727204160.80198: iterating over new_blocks loaded from include file 15980 1727204160.80200: in VariableManager get_vars() 15980 1727204160.80213: done with get_vars() 15980 1727204160.80215: filtering new block on tags 15980 1727204160.80238: done filtering new block on tags 15980 1727204160.80243: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for 
managed-node2 15980 1727204160.80249: extending task lists for all hosts with included blocks 15980 1727204160.80292: done extending task lists 15980 1727204160.80293: done processing included files 15980 1727204160.80294: results queue empty 15980 1727204160.80295: checking for any_errors_fatal 15980 1727204160.80297: done checking for any_errors_fatal 15980 1727204160.80297: checking for max_fail_percentage 15980 1727204160.80299: done checking for max_fail_percentage 15980 1727204160.80299: checking to see if all hosts have failed and the running result is not ok 15980 1727204160.80300: done checking to see if all hosts have failed 15980 1727204160.80301: getting the remaining hosts for this loop 15980 1727204160.80302: done getting the remaining hosts for this loop 15980 1727204160.80305: getting the next task for host managed-node2 15980 1727204160.80309: done getting next task for host managed-node2 15980 1727204160.80312: ^ task is: TASK: Include the task 'get_profile_stat.yml' 15980 1727204160.80314: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204160.80319: getting variables 15980 1727204160.80320: in VariableManager get_vars() 15980 1727204160.80332: Calling all_inventory to load vars for managed-node2 15980 1727204160.80334: Calling groups_inventory to load vars for managed-node2 15980 1727204160.80337: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204160.80343: Calling all_plugins_play to load vars for managed-node2 15980 1727204160.80346: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204160.80349: Calling groups_plugins_play to load vars for managed-node2 15980 1727204160.81376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204160.82562: done with get_vars() 15980 1727204160.82589: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:56:00 -0400 (0:00:00.072) 0:00:22.236 ***** 15980 1727204160.82658: entering _queue_task() for managed-node2/include_tasks 15980 1727204160.82955: worker is 1 (out of 1 available) 15980 1727204160.82971: exiting _queue_task() for managed-node2/include_tasks 15980 1727204160.82984: done queuing things up, now waiting for results queue to drain 15980 1727204160.82986: waiting for pending results... 
15980 1727204160.83176: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 15980 1727204160.83272: in run() - task 127b8e07-fff9-5f1d-4b72-00000000025f 15980 1727204160.83283: variable 'ansible_search_path' from source: unknown 15980 1727204160.83287: variable 'ansible_search_path' from source: unknown 15980 1727204160.83326: calling self._execute() 15980 1727204160.83403: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204160.83413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204160.83422: variable 'omit' from source: magic vars 15980 1727204160.83743: variable 'ansible_distribution_major_version' from source: facts 15980 1727204160.83759: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204160.83765: _execute() done 15980 1727204160.83768: dumping result to json 15980 1727204160.83771: done dumping result, returning 15980 1727204160.83775: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-5f1d-4b72-00000000025f] 15980 1727204160.83778: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000025f 15980 1727204160.83878: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000025f 15980 1727204160.83880: WORKER PROCESS EXITING 15980 1727204160.83910: no more pending results, returning what we have 15980 1727204160.83917: in VariableManager get_vars() 15980 1727204160.83953: Calling all_inventory to load vars for managed-node2 15980 1727204160.83957: Calling groups_inventory to load vars for managed-node2 15980 1727204160.83961: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204160.83980: Calling all_plugins_play to load vars for managed-node2 15980 1727204160.83983: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204160.83986: Calling groups_plugins_play to load vars for managed-node2 15980 
1727204160.85037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204160.86234: done with get_vars() 15980 1727204160.86261: variable 'ansible_search_path' from source: unknown 15980 1727204160.86263: variable 'ansible_search_path' from source: unknown 15980 1727204160.86274: variable 'task' from source: play vars 15980 1727204160.86367: variable 'task' from source: play vars 15980 1727204160.86396: we have included files to process 15980 1727204160.86397: generating all_blocks data 15980 1727204160.86398: done generating all_blocks data 15980 1727204160.86399: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15980 1727204160.86400: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15980 1727204160.86402: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15980 1727204160.87193: done processing included file 15980 1727204160.87195: iterating over new_blocks loaded from include file 15980 1727204160.87196: in VariableManager get_vars() 15980 1727204160.87206: done with get_vars() 15980 1727204160.87209: filtering new block on tags 15980 1727204160.87230: done filtering new block on tags 15980 1727204160.87233: in VariableManager get_vars() 15980 1727204160.87244: done with get_vars() 15980 1727204160.87245: filtering new block on tags 15980 1727204160.87259: done filtering new block on tags 15980 1727204160.87261: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 15980 1727204160.87266: extending task lists for all hosts with included blocks 15980 1727204160.87382: done extending 
task lists 15980 1727204160.87383: done processing included files 15980 1727204160.87383: results queue empty 15980 1727204160.87384: checking for any_errors_fatal 15980 1727204160.87387: done checking for any_errors_fatal 15980 1727204160.87387: checking for max_fail_percentage 15980 1727204160.87388: done checking for max_fail_percentage 15980 1727204160.87388: checking to see if all hosts have failed and the running result is not ok 15980 1727204160.87389: done checking to see if all hosts have failed 15980 1727204160.87390: getting the remaining hosts for this loop 15980 1727204160.87391: done getting the remaining hosts for this loop 15980 1727204160.87393: getting the next task for host managed-node2 15980 1727204160.87396: done getting next task for host managed-node2 15980 1727204160.87398: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 15980 1727204160.87400: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204160.87401: getting variables 15980 1727204160.87402: in VariableManager get_vars() 15980 1727204160.87556: Calling all_inventory to load vars for managed-node2 15980 1727204160.87559: Calling groups_inventory to load vars for managed-node2 15980 1727204160.87561: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204160.87568: Calling all_plugins_play to load vars for managed-node2 15980 1727204160.87570: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204160.87573: Calling groups_plugins_play to load vars for managed-node2 15980 1727204160.88391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204160.89586: done with get_vars() 15980 1727204160.89613: done getting variables 15980 1727204160.89655: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:56:00 -0400 (0:00:00.070) 0:00:22.307 ***** 15980 1727204160.89683: entering _queue_task() for managed-node2/set_fact 15980 1727204160.90010: worker is 1 (out of 1 available) 15980 1727204160.90023: exiting _queue_task() for managed-node2/set_fact 15980 1727204160.90036: done queuing things up, now waiting for results queue to drain 15980 1727204160.90038: waiting for pending results... 
15980 1727204160.90243: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 15980 1727204160.90328: in run() - task 127b8e07-fff9-5f1d-4b72-00000000026c 15980 1727204160.90338: variable 'ansible_search_path' from source: unknown 15980 1727204160.90342: variable 'ansible_search_path' from source: unknown 15980 1727204160.90383: calling self._execute() 15980 1727204160.90471: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204160.90475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204160.90487: variable 'omit' from source: magic vars 15980 1727204160.90803: variable 'ansible_distribution_major_version' from source: facts 15980 1727204160.90815: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204160.90825: variable 'omit' from source: magic vars 15980 1727204160.90860: variable 'omit' from source: magic vars 15980 1727204160.90892: variable 'omit' from source: magic vars 15980 1727204160.90929: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204160.90967: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204160.90986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204160.91002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204160.91013: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204160.91044: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204160.91047: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204160.91051: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 15980 1727204160.91125: Set connection var ansible_connection to ssh 15980 1727204160.91136: Set connection var ansible_pipelining to False 15980 1727204160.91140: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204160.91147: Set connection var ansible_timeout to 10 15980 1727204160.91152: Set connection var ansible_shell_type to sh 15980 1727204160.91158: Set connection var ansible_shell_executable to /bin/sh 15980 1727204160.91184: variable 'ansible_shell_executable' from source: unknown 15980 1727204160.91188: variable 'ansible_connection' from source: unknown 15980 1727204160.91191: variable 'ansible_module_compression' from source: unknown 15980 1727204160.91193: variable 'ansible_shell_type' from source: unknown 15980 1727204160.91196: variable 'ansible_shell_executable' from source: unknown 15980 1727204160.91198: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204160.91203: variable 'ansible_pipelining' from source: unknown 15980 1727204160.91206: variable 'ansible_timeout' from source: unknown 15980 1727204160.91210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204160.91329: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204160.91341: variable 'omit' from source: magic vars 15980 1727204160.91348: starting attempt loop 15980 1727204160.91351: running the handler 15980 1727204160.91363: handler run complete 15980 1727204160.91375: attempt loop complete, returning result 15980 1727204160.91380: _execute() done 15980 1727204160.91382: dumping result to json 15980 1727204160.91385: done dumping result, returning 15980 1727204160.91397: done running TaskExecutor() for 
managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-5f1d-4b72-00000000026c] 15980 1727204160.91400: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000026c 15980 1727204160.91489: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000026c 15980 1727204160.91492: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 15980 1727204160.91550: no more pending results, returning what we have 15980 1727204160.91554: results queue empty 15980 1727204160.91555: checking for any_errors_fatal 15980 1727204160.91556: done checking for any_errors_fatal 15980 1727204160.91557: checking for max_fail_percentage 15980 1727204160.91558: done checking for max_fail_percentage 15980 1727204160.91559: checking to see if all hosts have failed and the running result is not ok 15980 1727204160.91560: done checking to see if all hosts have failed 15980 1727204160.91561: getting the remaining hosts for this loop 15980 1727204160.91563: done getting the remaining hosts for this loop 15980 1727204160.91569: getting the next task for host managed-node2 15980 1727204160.91578: done getting next task for host managed-node2 15980 1727204160.91580: ^ task is: TASK: Stat profile file 15980 1727204160.91584: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204160.91590: getting variables 15980 1727204160.91592: in VariableManager get_vars() 15980 1727204160.91624: Calling all_inventory to load vars for managed-node2 15980 1727204160.91627: Calling groups_inventory to load vars for managed-node2 15980 1727204160.91631: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204160.91643: Calling all_plugins_play to load vars for managed-node2 15980 1727204160.91646: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204160.91648: Calling groups_plugins_play to load vars for managed-node2 15980 1727204160.92897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204160.95110: done with get_vars() 15980 1727204160.95153: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:56:00 -0400 (0:00:00.055) 0:00:22.362 ***** 15980 1727204160.95261: entering _queue_task() for managed-node2/stat 15980 1727204160.95656: worker is 1 (out of 1 available) 15980 1727204160.95669: exiting _queue_task() for managed-node2/stat 15980 1727204160.95683: done queuing things up, now waiting for results queue to drain 15980 1727204160.95685: waiting for pending results... 
15980 1727204160.96046: running TaskExecutor() for managed-node2/TASK: Stat profile file 15980 1727204160.96185: in run() - task 127b8e07-fff9-5f1d-4b72-00000000026d 15980 1727204160.96188: variable 'ansible_search_path' from source: unknown 15980 1727204160.96191: variable 'ansible_search_path' from source: unknown 15980 1727204160.96196: calling self._execute() 15980 1727204160.96312: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204160.96328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204160.96344: variable 'omit' from source: magic vars 15980 1727204160.96795: variable 'ansible_distribution_major_version' from source: facts 15980 1727204160.96836: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204160.96840: variable 'omit' from source: magic vars 15980 1727204160.96892: variable 'omit' from source: magic vars 15980 1727204160.97053: variable 'profile' from source: play vars 15980 1727204160.97056: variable 'interface' from source: set_fact 15980 1727204160.97109: variable 'interface' from source: set_fact 15980 1727204160.97137: variable 'omit' from source: magic vars 15980 1727204160.97194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204160.97242: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204160.97277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204160.97372: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204160.97375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204160.97382: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 
1727204160.97385: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204160.97387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204160.97489: Set connection var ansible_connection to ssh 15980 1727204160.97507: Set connection var ansible_pipelining to False 15980 1727204160.97519: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204160.97534: Set connection var ansible_timeout to 10 15980 1727204160.97546: Set connection var ansible_shell_type to sh 15980 1727204160.97557: Set connection var ansible_shell_executable to /bin/sh 15980 1727204160.97596: variable 'ansible_shell_executable' from source: unknown 15980 1727204160.97611: variable 'ansible_connection' from source: unknown 15980 1727204160.97619: variable 'ansible_module_compression' from source: unknown 15980 1727204160.97630: variable 'ansible_shell_type' from source: unknown 15980 1727204160.97638: variable 'ansible_shell_executable' from source: unknown 15980 1727204160.97671: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204160.97674: variable 'ansible_pipelining' from source: unknown 15980 1727204160.97677: variable 'ansible_timeout' from source: unknown 15980 1727204160.97679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204160.97868: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204160.97878: variable 'omit' from source: magic vars 15980 1727204160.97884: starting attempt loop 15980 1727204160.97887: running the handler 15980 1727204160.97899: _low_level_execute_command(): starting 15980 1727204160.97907: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204160.98471: stderr chunk 
(state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204160.98475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15980 1727204160.98480: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204160.98538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204160.98542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204160.98544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204160.98628: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204161.00437: stdout chunk (state=3): >>>/root <<< 15980 1727204161.00574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204161.00710: stderr chunk (state=3): >>><<< 15980 1727204161.00714: stdout chunk (state=3): >>><<< 15980 1727204161.00737: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204161.00749: _low_level_execute_command(): starting 15980 1727204161.00759: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204161.007377-17874-108986964994738 `" && echo ansible-tmp-1727204161.007377-17874-108986964994738="` echo /root/.ansible/tmp/ansible-tmp-1727204161.007377-17874-108986964994738 `" ) && sleep 0' 15980 1727204161.01605: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204161.01619: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204161.01647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204161.01675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204161.01875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204161.03898: stdout chunk (state=3): >>>ansible-tmp-1727204161.007377-17874-108986964994738=/root/.ansible/tmp/ansible-tmp-1727204161.007377-17874-108986964994738 <<< 15980 1727204161.04027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204161.04082: stderr chunk (state=3): >>><<< 15980 1727204161.04086: stdout chunk (state=3): >>><<< 15980 1727204161.04101: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204161.007377-17874-108986964994738=/root/.ansible/tmp/ansible-tmp-1727204161.007377-17874-108986964994738 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204161.04148: variable 'ansible_module_compression' from source: unknown 15980 1727204161.04203: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15980 1727204161.04240: variable 'ansible_facts' from source: unknown 15980 1727204161.04312: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204161.007377-17874-108986964994738/AnsiballZ_stat.py 15980 1727204161.04430: Sending initial data 15980 1727204161.04434: Sent initial data (152 bytes) 15980 1727204161.04940: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204161.04944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204161.04947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204161.04949: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204161.04952: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204161.05018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204161.05021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204161.05023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204161.05091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204161.07009: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204161.07081: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204161.07160: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpxiq2816e /root/.ansible/tmp/ansible-tmp-1727204161.007377-17874-108986964994738/AnsiballZ_stat.py <<< 15980 1727204161.07164: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204161.007377-17874-108986964994738/AnsiballZ_stat.py" <<< 15980 1727204161.07259: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpxiq2816e" to remote "/root/.ansible/tmp/ansible-tmp-1727204161.007377-17874-108986964994738/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204161.007377-17874-108986964994738/AnsiballZ_stat.py" <<< 15980 1727204161.08660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204161.08874: stderr chunk (state=3): >>><<< 15980 1727204161.08877: stdout chunk (state=3): >>><<< 15980 1727204161.08880: done transferring module to remote 15980 1727204161.08882: _low_level_execute_command(): starting 15980 1727204161.08884: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204161.007377-17874-108986964994738/ /root/.ansible/tmp/ansible-tmp-1727204161.007377-17874-108986964994738/AnsiballZ_stat.py && sleep 0' 15980 1727204161.10037: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204161.10385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204161.10504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204161.10620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204161.12588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204161.12702: stderr chunk (state=3): >>><<< 15980 1727204161.12888: stdout chunk (state=3): >>><<< 15980 1727204161.13011: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204161.13015: _low_level_execute_command(): starting 15980 1727204161.13018: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204161.007377-17874-108986964994738/AnsiballZ_stat.py && sleep 0' 15980 1727204161.14299: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204161.14334: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204161.14348: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15980 1727204161.14524: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204161.14918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204161.14922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204161.31806: stdout chunk (state=3): >>> {"changed": false, 
"stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15980 1727204161.33305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204161.33316: stdout chunk (state=3): >>><<< 15980 1727204161.33332: stderr chunk (state=3): >>><<< 15980 1727204161.33358: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.47.73 closed. 15980 1727204161.33417: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204161.007377-17874-108986964994738/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204161.33773: _low_level_execute_command(): starting 15980 1727204161.33777: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204161.007377-17874-108986964994738/ > /dev/null 2>&1 && sleep 0' 15980 1727204161.34948: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204161.34953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204161.34957: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204161.35042: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204161.35048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204161.35255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204161.35321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204161.37574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204161.37605: stderr chunk (state=3): >>><<< 15980 1727204161.37609: stdout chunk (state=3): >>><<< 15980 1727204161.37772: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204161.37776: handler run complete 15980 1727204161.37779: attempt loop complete, returning result 15980 1727204161.37781: _execute() done 15980 1727204161.37783: dumping result to json 15980 1727204161.37785: done dumping result, returning 15980 1727204161.37787: done running TaskExecutor() for managed-node2/TASK: Stat profile file [127b8e07-fff9-5f1d-4b72-00000000026d] 15980 1727204161.37789: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000026d 15980 1727204161.37873: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000026d 15980 1727204161.37877: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 15980 1727204161.38093: no more pending results, returning what we have 15980 1727204161.38098: results queue empty 15980 1727204161.38099: checking for any_errors_fatal 15980 1727204161.38106: done checking for any_errors_fatal 15980 1727204161.38107: checking for max_fail_percentage 15980 1727204161.38109: done checking for max_fail_percentage 15980 1727204161.38110: checking to see if all hosts have failed and the running result is not ok 15980 1727204161.38111: done checking to see if all hosts have failed 15980 1727204161.38112: getting the remaining hosts for this loop 15980 1727204161.38114: done getting the remaining hosts for this loop 15980 1727204161.38120: getting the next task for host managed-node2 15980 1727204161.38128: done getting next task for host managed-node2 15980 1727204161.38130: ^ task is: TASK: Set NM profile exist flag based on the profile files 15980 1727204161.38135: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204161.38140: getting variables 15980 1727204161.38142: in VariableManager get_vars() 15980 1727204161.38179: Calling all_inventory to load vars for managed-node2 15980 1727204161.38182: Calling groups_inventory to load vars for managed-node2 15980 1727204161.38186: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204161.38200: Calling all_plugins_play to load vars for managed-node2 15980 1727204161.38202: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204161.38205: Calling groups_plugins_play to load vars for managed-node2 15980 1727204161.42676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204161.46604: done with get_vars() 15980 1727204161.46636: done getting variables 15980 1727204161.46744: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:56:01 -0400 (0:00:00.515) 0:00:22.878 ***** 15980 1727204161.46781: entering _queue_task() for managed-node2/set_fact 15980 1727204161.47203: worker is 1 (out of 1 available) 15980 1727204161.47218: exiting _queue_task() for managed-node2/set_fact 15980 1727204161.47235: done queuing things up, now waiting for results queue to drain 15980 1727204161.47237: waiting for pending results... 15980 1727204161.47521: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 15980 1727204161.47603: in run() - task 127b8e07-fff9-5f1d-4b72-00000000026e 15980 1727204161.47624: variable 'ansible_search_path' from source: unknown 15980 1727204161.47631: variable 'ansible_search_path' from source: unknown 15980 1727204161.47772: calling self._execute() 15980 1727204161.47792: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204161.47805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204161.47823: variable 'omit' from source: magic vars 15980 1727204161.48263: variable 'ansible_distribution_major_version' from source: facts 15980 1727204161.48285: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204161.48417: variable 'profile_stat' from source: set_fact 15980 1727204161.48437: Evaluated conditional (profile_stat.stat.exists): False 15980 1727204161.48444: when evaluation is False, skipping this task 15980 1727204161.48451: _execute() done 15980 1727204161.48457: dumping result to json 15980 1727204161.48465: done dumping result, returning 15980 1727204161.48477: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-5f1d-4b72-00000000026e] 15980 1727204161.48486: sending task result for task 
127b8e07-fff9-5f1d-4b72-00000000026e skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15980 1727204161.48737: no more pending results, returning what we have 15980 1727204161.48742: results queue empty 15980 1727204161.48743: checking for any_errors_fatal 15980 1727204161.48752: done checking for any_errors_fatal 15980 1727204161.48752: checking for max_fail_percentage 15980 1727204161.48754: done checking for max_fail_percentage 15980 1727204161.48755: checking to see if all hosts have failed and the running result is not ok 15980 1727204161.48756: done checking to see if all hosts have failed 15980 1727204161.48757: getting the remaining hosts for this loop 15980 1727204161.48758: done getting the remaining hosts for this loop 15980 1727204161.48763: getting the next task for host managed-node2 15980 1727204161.48772: done getting next task for host managed-node2 15980 1727204161.48775: ^ task is: TASK: Get NM profile info 15980 1727204161.48783: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204161.48788: getting variables 15980 1727204161.48790: in VariableManager get_vars() 15980 1727204161.48825: Calling all_inventory to load vars for managed-node2 15980 1727204161.48830: Calling groups_inventory to load vars for managed-node2 15980 1727204161.48950: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204161.48971: Calling all_plugins_play to load vars for managed-node2 15980 1727204161.48976: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204161.48979: Calling groups_plugins_play to load vars for managed-node2 15980 1727204161.50128: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000026e 15980 1727204161.50136: WORKER PROCESS EXITING 15980 1727204161.51618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204161.55005: done with get_vars() 15980 1727204161.55045: done getting variables 15980 1727204161.55163: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:56:01 -0400 (0:00:00.084) 0:00:22.962 ***** 15980 1727204161.55205: entering _queue_task() for managed-node2/shell 15980 1727204161.55208: Creating lock for shell 15980 1727204161.55633: worker is 1 (out of 1 available) 15980 1727204161.55648: exiting _queue_task() for managed-node2/shell 15980 1727204161.55663: done queuing things up, now waiting for results queue to drain 15980 1727204161.55670: waiting for pending results... 
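[Editor's note: the "Get NM profile info" task queued above runs, per the module invocation recorded later in this log, the pipeline `nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc`. The sketch below reproduces that filtering logic on hypothetical sample output rather than a live `nmcli` query, since a reader's machine may not have NetworkManager running; the sample profile names other than `LSR-TST-br31` are invented for illustration.]

```shell
#!/bin/sh
# Hypothetical stand-in for `nmcli -f NAME,FILENAME connection show`:
# two columns, profile name and the file backing it.
sample_output='NAME          FILENAME
LSR-TST-br31  /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection
lo            /run/NetworkManager/system-connections/lo.nmconnection'

# The task's filter: first keep lines for the profile under test,
# then keep only those whose backing file lives under /etc --
# i.e. persistent profiles, as opposed to runtime-only ones under /run.
printf '%s\n' "$sample_output" | grep LSR-TST-br31 | grep /etc
```

A non-empty result (exit status 0 from the final `grep`) is what lets the playbook conclude that a persistent NM profile file exists for the interface, which is exactly the `rc: 0` outcome recorded for this task further down in the log.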
15980 1727204161.55943: running TaskExecutor() for managed-node2/TASK: Get NM profile info 15980 1727204161.56111: in run() - task 127b8e07-fff9-5f1d-4b72-00000000026f 15980 1727204161.56143: variable 'ansible_search_path' from source: unknown 15980 1727204161.56155: variable 'ansible_search_path' from source: unknown 15980 1727204161.56208: calling self._execute() 15980 1727204161.56358: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204161.56373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204161.56396: variable 'omit' from source: magic vars 15980 1727204161.57275: variable 'ansible_distribution_major_version' from source: facts 15980 1727204161.57541: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204161.57546: variable 'omit' from source: magic vars 15980 1727204161.57548: variable 'omit' from source: magic vars 15980 1727204161.57833: variable 'profile' from source: play vars 15980 1727204161.57876: variable 'interface' from source: set_fact 15980 1727204161.58068: variable 'interface' from source: set_fact 15980 1727204161.58139: variable 'omit' from source: magic vars 15980 1727204161.58196: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204161.58260: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204161.58290: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204161.58336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204161.58356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204161.58416: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 
1727204161.58419: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204161.58422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204161.58543: Set connection var ansible_connection to ssh 15980 1727204161.58634: Set connection var ansible_pipelining to False 15980 1727204161.58638: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204161.58640: Set connection var ansible_timeout to 10 15980 1727204161.58648: Set connection var ansible_shell_type to sh 15980 1727204161.58650: Set connection var ansible_shell_executable to /bin/sh 15980 1727204161.58653: variable 'ansible_shell_executable' from source: unknown 15980 1727204161.58655: variable 'ansible_connection' from source: unknown 15980 1727204161.58657: variable 'ansible_module_compression' from source: unknown 15980 1727204161.58662: variable 'ansible_shell_type' from source: unknown 15980 1727204161.58672: variable 'ansible_shell_executable' from source: unknown 15980 1727204161.58678: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204161.58685: variable 'ansible_pipelining' from source: unknown 15980 1727204161.58692: variable 'ansible_timeout' from source: unknown 15980 1727204161.58700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204161.58884: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204161.58903: variable 'omit' from source: magic vars 15980 1727204161.58915: starting attempt loop 15980 1727204161.58923: running the handler 15980 1727204161.58942: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204161.59071: _low_level_execute_command(): starting 15980 1727204161.59076: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204161.60205: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204161.60259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204161.60331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204161.60356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204161.60543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204161.62469: stdout chunk (state=3): >>>/root <<< 15980 1727204161.62474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204161.62615: stderr chunk (state=3): >>><<< 15980 1727204161.62620: stdout chunk (state=3): 
>>><<< 15980 1727204161.62905: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204161.62909: _low_level_execute_command(): starting 15980 1727204161.62912: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204161.627056-17955-35943424659710 `" && echo ansible-tmp-1727204161.627056-17955-35943424659710="` echo /root/.ansible/tmp/ansible-tmp-1727204161.627056-17955-35943424659710 `" ) && sleep 0' 15980 1727204161.64592: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204161.64688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204161.64774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204161.64969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204161.65044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204161.65303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204161.67398: stdout chunk (state=3): >>>ansible-tmp-1727204161.627056-17955-35943424659710=/root/.ansible/tmp/ansible-tmp-1727204161.627056-17955-35943424659710 <<< 15980 1727204161.67581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204161.67595: stdout chunk (state=3): >>><<< 15980 1727204161.67609: stderr chunk (state=3): >>><<< 15980 1727204161.67634: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204161.627056-17955-35943424659710=/root/.ansible/tmp/ansible-tmp-1727204161.627056-17955-35943424659710 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204161.67876: variable 'ansible_module_compression' from source: unknown 15980 1727204161.67880: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15980 1727204161.67985: variable 'ansible_facts' from source: unknown 15980 1727204161.68177: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204161.627056-17955-35943424659710/AnsiballZ_command.py 15980 1727204161.68539: Sending initial data 15980 1727204161.68542: Sent initial data (154 bytes) 15980 1727204161.69893: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204161.69915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204161.70034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204161.70284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204161.70391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204161.72123: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15980 1727204161.72152: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204161.72224: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204161.72299: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmptu9clbxg /root/.ansible/tmp/ansible-tmp-1727204161.627056-17955-35943424659710/AnsiballZ_command.py <<< 15980 1727204161.72480: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204161.627056-17955-35943424659710/AnsiballZ_command.py" <<< 15980 1727204161.72504: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmptu9clbxg" to remote "/root/.ansible/tmp/ansible-tmp-1727204161.627056-17955-35943424659710/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204161.627056-17955-35943424659710/AnsiballZ_command.py" <<< 15980 1727204161.74296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204161.74333: stderr chunk (state=3): >>><<< 15980 1727204161.74342: stdout chunk (state=3): >>><<< 15980 1727204161.74376: done transferring module to remote 15980 1727204161.74396: _low_level_execute_command(): starting 15980 1727204161.74407: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204161.627056-17955-35943424659710/ /root/.ansible/tmp/ansible-tmp-1727204161.627056-17955-35943424659710/AnsiballZ_command.py && sleep 0' 15980 1727204161.75847: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204161.75863: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204161.75884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204161.76179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204161.76229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204161.76297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204161.78232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204161.78339: stderr chunk (state=3): >>><<< 15980 1727204161.78363: stdout chunk (state=3): >>><<< 15980 1727204161.78388: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204161.78396: _low_level_execute_command(): starting 15980 1727204161.78406: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204161.627056-17955-35943424659710/AnsiballZ_command.py && sleep 0' 15980 1727204161.79076: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204161.79093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204161.79110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204161.79132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204161.79182: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204161.79246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204161.79277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 15980 1727204161.79301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204161.79772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204161.98053: stdout chunk (state=3): >>> {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-24 14:56:01.960415", "end": "2024-09-24 14:56:01.979006", "delta": "0:00:00.018591", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15980 1727204161.99674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 15980 1727204161.99728: stderr chunk (state=3): >>><<< 15980 1727204161.99739: stdout chunk (state=3): >>><<< 15980 1727204161.99769: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-24 14:56:01.960415", "end": "2024-09-24 14:56:01.979006", "delta": "0:00:00.018591", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 Shared connection to 10.31.47.73 closed. 15980 1727204161.99872: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204161.627056-17955-35943424659710/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204161.99876: _low_level_execute_command(): starting 15980 1727204161.99879: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204161.627056-17955-35943424659710/ > /dev/null 2>&1 && sleep 0' 15980 1727204162.00367: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204162.00371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204162.00374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204162.00432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204162.00436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204162.00440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204162.00511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204162.02920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204162.02972: stderr chunk (state=3): >>><<< 15980 1727204162.03008: stdout chunk (state=3): >>><<< 15980 1727204162.03117: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204162.03121: handler run complete 15980 1727204162.03177: Evaluated conditional (False): False 15980 1727204162.03314: attempt loop complete, returning result 15980 1727204162.03317: _execute() done 15980 1727204162.03319: dumping result to json 15980 1727204162.03359: done dumping result, returning 15980 1727204162.03577: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [127b8e07-fff9-5f1d-4b72-00000000026f] 15980 1727204162.03581: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000026f 15980 1727204162.03754: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000026f 15980 1727204162.03758: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.018591", "end": "2024-09-24 14:56:01.979006", "rc": 0, "start": "2024-09-24 14:56:01.960415" } STDOUT: LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection 15980 1727204162.03854: no more pending results, returning what we have 15980 1727204162.03858: results queue empty 15980 1727204162.03859: checking for any_errors_fatal 15980 1727204162.03868: done checking for any_errors_fatal 15980 1727204162.03869: checking for max_fail_percentage 15980 1727204162.03871: done checking for max_fail_percentage 15980 1727204162.03872: checking to see if all hosts have failed and the running result is not ok 15980 1727204162.03874: done checking to see if all hosts have failed 15980 1727204162.04011: getting the remaining hosts for this loop 15980 1727204162.04014: done getting the remaining hosts for this loop 15980 1727204162.04020: getting the next task for host managed-node2 15980 1727204162.04030: done getting next task for host managed-node2 15980 1727204162.04033: ^ task is: TASK: Set NM profile exist 
flag and ansible_managed flag true based on the nmcli output 15980 1727204162.04038: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204162.04042: getting variables 15980 1727204162.04047: in VariableManager get_vars() 15980 1727204162.04192: Calling all_inventory to load vars for managed-node2 15980 1727204162.04195: Calling groups_inventory to load vars for managed-node2 15980 1727204162.04199: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204162.04338: Calling all_plugins_play to load vars for managed-node2 15980 1727204162.04343: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204162.04348: Calling groups_plugins_play to load vars for managed-node2 15980 1727204162.08879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204162.12731: done with get_vars() 15980 1727204162.12807: done getting variables 15980 1727204162.12888: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:56:02 -0400 (0:00:00.580) 0:00:23.542 ***** 15980 1727204162.13362: entering _queue_task() for managed-node2/set_fact 15980 1727204162.15017: worker is 1 (out of 1 available) 15980 1727204162.15038: exiting _queue_task() for managed-node2/set_fact 15980 1727204162.15052: done queuing things up, now waiting for results queue to drain 15980 1727204162.15054: waiting for pending results... 15980 1727204162.15438: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15980 1727204162.15678: in run() - task 127b8e07-fff9-5f1d-4b72-000000000270 15980 1727204162.15684: variable 'ansible_search_path' from source: unknown 15980 1727204162.15692: variable 'ansible_search_path' from source: unknown 15980 1727204162.15874: calling self._execute() 15980 1727204162.15970: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204162.16073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204162.16098: variable 'omit' from source: magic vars 15980 1727204162.17372: variable 'ansible_distribution_major_version' from source: facts 15980 1727204162.17377: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204162.17380: variable 'nm_profile_exists' from source: set_fact 15980 1727204162.17382: Evaluated conditional (nm_profile_exists.rc == 0): True 15980 1727204162.17384: variable 'omit' from source: magic vars 15980 1727204162.17632: variable 'omit' from source: magic vars 15980 1727204162.17676: 
variable 'omit' from source: magic vars 15980 1727204162.17799: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204162.17912: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204162.17942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204162.18003: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204162.18271: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204162.18275: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204162.18278: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204162.18280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204162.18352: Set connection var ansible_connection to ssh 15980 1727204162.18494: Set connection var ansible_pipelining to False 15980 1727204162.18505: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204162.18516: Set connection var ansible_timeout to 10 15980 1727204162.18527: Set connection var ansible_shell_type to sh 15980 1727204162.18539: Set connection var ansible_shell_executable to /bin/sh 15980 1727204162.18579: variable 'ansible_shell_executable' from source: unknown 15980 1727204162.18588: variable 'ansible_connection' from source: unknown 15980 1727204162.18596: variable 'ansible_module_compression' from source: unknown 15980 1727204162.18602: variable 'ansible_shell_type' from source: unknown 15980 1727204162.18609: variable 'ansible_shell_executable' from source: unknown 15980 1727204162.18616: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204162.18624: variable 'ansible_pipelining' from 
source: unknown 15980 1727204162.18635: variable 'ansible_timeout' from source: unknown 15980 1727204162.18645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204162.18874: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204162.18893: variable 'omit' from source: magic vars 15980 1727204162.18903: starting attempt loop 15980 1727204162.18909: running the handler 15980 1727204162.18928: handler run complete 15980 1727204162.18944: attempt loop complete, returning result 15980 1727204162.18950: _execute() done 15980 1727204162.18956: dumping result to json 15980 1727204162.18963: done dumping result, returning 15980 1727204162.18984: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [127b8e07-fff9-5f1d-4b72-000000000270] 15980 1727204162.18994: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000270 ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 15980 1727204162.19279: no more pending results, returning what we have 15980 1727204162.19283: results queue empty 15980 1727204162.19284: checking for any_errors_fatal 15980 1727204162.19291: done checking for any_errors_fatal 15980 1727204162.19292: checking for max_fail_percentage 15980 1727204162.19294: done checking for max_fail_percentage 15980 1727204162.19295: checking to see if all hosts have failed and the running result is not ok 15980 1727204162.19296: done checking to see if all hosts have failed 15980 1727204162.19297: getting the remaining hosts for this loop 15980 1727204162.19305: done 
getting the remaining hosts for this loop 15980 1727204162.19310: getting the next task for host managed-node2 15980 1727204162.19321: done getting next task for host managed-node2 15980 1727204162.19324: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 15980 1727204162.19329: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204162.19335: getting variables 15980 1727204162.19337: in VariableManager get_vars() 15980 1727204162.19476: Calling all_inventory to load vars for managed-node2 15980 1727204162.19480: Calling groups_inventory to load vars for managed-node2 15980 1727204162.19484: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204162.19490: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000270 15980 1727204162.19493: WORKER PROCESS EXITING 15980 1727204162.19505: Calling all_plugins_play to load vars for managed-node2 15980 1727204162.19508: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204162.19511: Calling groups_plugins_play to load vars for managed-node2 15980 1727204162.22864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204162.25734: done with get_vars() 15980 1727204162.25762: done getting variables 15980 1727204162.25833: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15980 1727204162.25974: variable 'profile' from source: play vars 15980 1727204162.25978: variable 'interface' from source: set_fact 15980 1727204162.26045: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] ******************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:56:02 -0400 (0:00:00.128) 0:00:23.671 ***** 15980 1727204162.26088: entering _queue_task() for managed-node2/command 15980 1727204162.27214: worker is 1 (out of 1 available) 15980 1727204162.27233: exiting _queue_task() for managed-node2/command 15980 
1727204162.27247: done queuing things up, now waiting for results queue to drain 15980 1727204162.27250: waiting for pending results... 15980 1727204162.27809: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 15980 1727204162.27956: in run() - task 127b8e07-fff9-5f1d-4b72-000000000272 15980 1727204162.27984: variable 'ansible_search_path' from source: unknown 15980 1727204162.28000: variable 'ansible_search_path' from source: unknown 15980 1727204162.28050: calling self._execute() 15980 1727204162.28190: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204162.28224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204162.28271: variable 'omit' from source: magic vars 15980 1727204162.28722: variable 'ansible_distribution_major_version' from source: facts 15980 1727204162.28746: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204162.28915: variable 'profile_stat' from source: set_fact 15980 1727204162.28942: Evaluated conditional (profile_stat.stat.exists): False 15980 1727204162.28993: when evaluation is False, skipping this task 15980 1727204162.28997: _execute() done 15980 1727204162.29000: dumping result to json 15980 1727204162.29002: done dumping result, returning 15980 1727204162.29005: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [127b8e07-fff9-5f1d-4b72-000000000272] 15980 1727204162.29007: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000272 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15980 1727204162.29181: no more pending results, returning what we have 15980 1727204162.29186: results queue empty 15980 1727204162.29188: checking for any_errors_fatal 15980 1727204162.29196: done checking for any_errors_fatal 15980 1727204162.29198: 
checking for max_fail_percentage 15980 1727204162.29200: done checking for max_fail_percentage 15980 1727204162.29202: checking to see if all hosts have failed and the running result is not ok 15980 1727204162.29203: done checking to see if all hosts have failed 15980 1727204162.29204: getting the remaining hosts for this loop 15980 1727204162.29205: done getting the remaining hosts for this loop 15980 1727204162.29212: getting the next task for host managed-node2 15980 1727204162.29220: done getting next task for host managed-node2 15980 1727204162.29222: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 15980 1727204162.29229: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204162.29233: getting variables 15980 1727204162.29235: in VariableManager get_vars() 15980 1727204162.29576: Calling all_inventory to load vars for managed-node2 15980 1727204162.29579: Calling groups_inventory to load vars for managed-node2 15980 1727204162.29583: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204162.29594: Calling all_plugins_play to load vars for managed-node2 15980 1727204162.29596: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204162.29599: Calling groups_plugins_play to load vars for managed-node2 15980 1727204162.30186: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000272 15980 1727204162.30191: WORKER PROCESS EXITING 15980 1727204162.32991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204162.39396: done with get_vars() 15980 1727204162.39434: done getting variables 15980 1727204162.39506: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15980 1727204162.39635: variable 'profile' from source: play vars 15980 1727204162.39639: variable 'interface' from source: set_fact 15980 1727204162.39708: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:56:02 -0400 (0:00:00.136) 0:00:23.807 ***** 15980 1727204162.39742: entering _queue_task() for managed-node2/set_fact 15980 1727204162.40159: worker is 1 (out of 1 available) 15980 1727204162.40174: exiting _queue_task() for managed-node2/set_fact 15980 
1727204162.40187: done queuing things up, now waiting for results queue to drain 15980 1727204162.40189: waiting for pending results... 15980 1727204162.40556: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 15980 1727204162.40759: in run() - task 127b8e07-fff9-5f1d-4b72-000000000273 15980 1727204162.40786: variable 'ansible_search_path' from source: unknown 15980 1727204162.40825: variable 'ansible_search_path' from source: unknown 15980 1727204162.40871: calling self._execute() 15980 1727204162.41021: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204162.41042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204162.41058: variable 'omit' from source: magic vars 15980 1727204162.41710: variable 'ansible_distribution_major_version' from source: facts 15980 1727204162.41729: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204162.41872: variable 'profile_stat' from source: set_fact 15980 1727204162.42030: Evaluated conditional (profile_stat.stat.exists): False 15980 1727204162.42034: when evaluation is False, skipping this task 15980 1727204162.42036: _execute() done 15980 1727204162.42039: dumping result to json 15980 1727204162.42042: done dumping result, returning 15980 1727204162.42044: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [127b8e07-fff9-5f1d-4b72-000000000273] 15980 1727204162.42047: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000273 15980 1727204162.42139: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000273 15980 1727204162.42143: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15980 1727204162.42210: no more pending results, returning what we have 15980 1727204162.42218: results 
queue empty 15980 1727204162.42219: checking for any_errors_fatal 15980 1727204162.42228: done checking for any_errors_fatal 15980 1727204162.42229: checking for max_fail_percentage 15980 1727204162.42231: done checking for max_fail_percentage 15980 1727204162.42233: checking to see if all hosts have failed and the running result is not ok 15980 1727204162.42234: done checking to see if all hosts have failed 15980 1727204162.42235: getting the remaining hosts for this loop 15980 1727204162.42237: done getting the remaining hosts for this loop 15980 1727204162.42245: getting the next task for host managed-node2 15980 1727204162.42253: done getting next task for host managed-node2 15980 1727204162.42256: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 15980 1727204162.42264: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204162.42274: getting variables 15980 1727204162.42276: in VariableManager get_vars() 15980 1727204162.42313: Calling all_inventory to load vars for managed-node2 15980 1727204162.42316: Calling groups_inventory to load vars for managed-node2 15980 1727204162.42321: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204162.42338: Calling all_plugins_play to load vars for managed-node2 15980 1727204162.42341: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204162.42345: Calling groups_plugins_play to load vars for managed-node2 15980 1727204162.45901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204162.48334: done with get_vars() 15980 1727204162.48374: done getting variables 15980 1727204162.48454: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15980 1727204162.48593: variable 'profile' from source: play vars 15980 1727204162.48598: variable 'interface' from source: set_fact 15980 1727204162.48671: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] *********************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:56:02 -0400 (0:00:00.089) 0:00:23.897 ***** 15980 1727204162.48705: entering _queue_task() for managed-node2/command 15980 1727204162.49128: worker is 1 (out of 1 available) 15980 1727204162.49142: exiting _queue_task() for managed-node2/command 15980 1727204162.49154: done queuing things up, now waiting for results queue to drain 15980 1727204162.49156: waiting for pending results... 
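[Editor's note] The two `skipping: [managed-node2]` results above (task paths `get_profile_stat.yml:49` and `:56`) both report `"false_condition": "profile_stat.stat.exists"`, i.e. the tasks are guarded by a `when:` on a previously registered `stat` result. A plausible sketch of such a guarded task — the grep pattern, ifcfg path, and register name are assumptions for illustration, not taken from this log:

```yaml
# Hypothetical reconstruction of a task like get_profile_stat.yml:49.
# It only runs when the ifcfg file was found by an earlier stat task
# (registered as profile_stat); otherwise Ansible emits the
# "Conditional result was False" skip seen in the log above.
- name: "Get the ansible_managed comment in ifcfg-{{ profile }}"
  command: grep 'Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}  # assumed path/pattern
  register: ansible_managed_comment   # assumed register name
  when: profile_stat.stat.exists
```

On this run the profile lives in `/etc/NetworkManager/system-connections/`, not in an ifcfg file, so `profile_stat.stat.exists` is false and every ifcfg-related task is skipped.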
15980 1727204162.49591: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 15980 1727204162.49688: in run() - task 127b8e07-fff9-5f1d-4b72-000000000274 15980 1727204162.49693: variable 'ansible_search_path' from source: unknown 15980 1727204162.49696: variable 'ansible_search_path' from source: unknown 15980 1727204162.49699: calling self._execute() 15980 1727204162.49796: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204162.49810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204162.49825: variable 'omit' from source: magic vars 15980 1727204162.50263: variable 'ansible_distribution_major_version' from source: facts 15980 1727204162.50286: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204162.50435: variable 'profile_stat' from source: set_fact 15980 1727204162.50559: Evaluated conditional (profile_stat.stat.exists): False 15980 1727204162.50563: when evaluation is False, skipping this task 15980 1727204162.50573: _execute() done 15980 1727204162.50576: dumping result to json 15980 1727204162.50578: done dumping result, returning 15980 1727204162.50582: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [127b8e07-fff9-5f1d-4b72-000000000274] 15980 1727204162.50584: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000274 15980 1727204162.50867: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000274 15980 1727204162.50873: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15980 1727204162.50925: no more pending results, returning what we have 15980 1727204162.50929: results queue empty 15980 1727204162.50930: checking for any_errors_fatal 15980 1727204162.50935: done checking for any_errors_fatal 15980 1727204162.50936: 
checking for max_fail_percentage 15980 1727204162.50938: done checking for max_fail_percentage 15980 1727204162.50939: checking to see if all hosts have failed and the running result is not ok 15980 1727204162.50940: done checking to see if all hosts have failed 15980 1727204162.50941: getting the remaining hosts for this loop 15980 1727204162.50942: done getting the remaining hosts for this loop 15980 1727204162.50946: getting the next task for host managed-node2 15980 1727204162.50954: done getting next task for host managed-node2 15980 1727204162.50957: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 15980 1727204162.50961: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204162.50968: getting variables 15980 1727204162.50970: in VariableManager get_vars() 15980 1727204162.51005: Calling all_inventory to load vars for managed-node2 15980 1727204162.51008: Calling groups_inventory to load vars for managed-node2 15980 1727204162.51012: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204162.51026: Calling all_plugins_play to load vars for managed-node2 15980 1727204162.51030: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204162.51034: Calling groups_plugins_play to load vars for managed-node2 15980 1727204162.52913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204162.56004: done with get_vars() 15980 1727204162.56042: done getting variables 15980 1727204162.56118: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15980 1727204162.56253: variable 'profile' from source: play vars 15980 1727204162.56258: variable 'interface' from source: set_fact 15980 1727204162.56328: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:56:02 -0400 (0:00:00.076) 0:00:23.974 ***** 15980 1727204162.56379: entering _queue_task() for managed-node2/set_fact 15980 1727204162.57092: worker is 1 (out of 1 available) 15980 1727204162.57108: exiting _queue_task() for managed-node2/set_fact 15980 1727204162.57121: done queuing things up, now waiting for results queue to drain 15980 1727204162.57123: waiting for pending results... 
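[Editor's note] The verification tasks in this stretch of the log all key off facts set earlier by the "Set NM profile exist flag and ansible_managed flag true based on the nmcli output" task (`get_profile_stat.yml:35`), which itself keyed off the registered `nmcli ... | grep` result (`Evaluated conditional (nm_profile_exists.rc == 0): True`). A sketch consistent with the conditional and the `ansible_facts` printed in that task's result — the exact task body is an assumption:

```yaml
# Hypothetical reconstruction of get_profile_stat.yml:35, based only on
# the "nm_profile_exists.rc == 0" conditional and the ansible_facts
# visible in this log's "ok:" result for the task.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0   # nm_profile_exists registered from the nmcli|grep command
```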
15980 1727204162.57886: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 15980 1727204162.58301: in run() - task 127b8e07-fff9-5f1d-4b72-000000000275 15980 1727204162.58315: variable 'ansible_search_path' from source: unknown 15980 1727204162.58319: variable 'ansible_search_path' from source: unknown 15980 1727204162.58362: calling self._execute() 15980 1727204162.58467: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204162.58821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204162.58835: variable 'omit' from source: magic vars 15980 1727204162.59648: variable 'ansible_distribution_major_version' from source: facts 15980 1727204162.59663: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204162.60125: variable 'profile_stat' from source: set_fact 15980 1727204162.60147: Evaluated conditional (profile_stat.stat.exists): False 15980 1727204162.60151: when evaluation is False, skipping this task 15980 1727204162.60153: _execute() done 15980 1727204162.60158: dumping result to json 15980 1727204162.60160: done dumping result, returning 15980 1727204162.60169: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [127b8e07-fff9-5f1d-4b72-000000000275] 15980 1727204162.60175: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000275 15980 1727204162.60289: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000275 15980 1727204162.60292: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15980 1727204162.60376: no more pending results, returning what we have 15980 1727204162.60381: results queue empty 15980 1727204162.60382: checking for any_errors_fatal 15980 1727204162.60388: done checking for any_errors_fatal 15980 1727204162.60389: 
checking for max_fail_percentage 15980 1727204162.60390: done checking for max_fail_percentage 15980 1727204162.60391: checking to see if all hosts have failed and the running result is not ok 15980 1727204162.60392: done checking to see if all hosts have failed 15980 1727204162.60393: getting the remaining hosts for this loop 15980 1727204162.60394: done getting the remaining hosts for this loop 15980 1727204162.60399: getting the next task for host managed-node2 15980 1727204162.60407: done getting next task for host managed-node2 15980 1727204162.60411: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 15980 1727204162.60414: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204162.60419: getting variables 15980 1727204162.60420: in VariableManager get_vars() 15980 1727204162.60453: Calling all_inventory to load vars for managed-node2 15980 1727204162.60456: Calling groups_inventory to load vars for managed-node2 15980 1727204162.60460: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204162.60478: Calling all_plugins_play to load vars for managed-node2 15980 1727204162.60481: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204162.60484: Calling groups_plugins_play to load vars for managed-node2 15980 1727204162.63068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204162.65248: done with get_vars() 15980 1727204162.65292: done getting variables 15980 1727204162.65361: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15980 1727204162.65500: variable 'profile' from source: play vars 15980 1727204162.65505: variable 'interface' from source: set_fact 15980 1727204162.65565: variable 'interface' from source: set_fact TASK [Assert that the profile is present - 'LSR-TST-br31'] ********************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:56:02 -0400 (0:00:00.092) 0:00:24.066 ***** 15980 1727204162.65604: entering _queue_task() for managed-node2/assert 15980 1727204162.66044: worker is 1 (out of 1 available) 15980 1727204162.66058: exiting _queue_task() for managed-node2/assert 15980 1727204162.66073: done queuing things up, now waiting for results queue to drain 15980 1727204162.66075: waiting for pending results... 
15980 1727204162.66476: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'LSR-TST-br31' 15980 1727204162.66481: in run() - task 127b8e07-fff9-5f1d-4b72-000000000260 15980 1727204162.66486: variable 'ansible_search_path' from source: unknown 15980 1727204162.66493: variable 'ansible_search_path' from source: unknown 15980 1727204162.66535: calling self._execute() 15980 1727204162.66642: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204162.66690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204162.66772: variable 'omit' from source: magic vars 15980 1727204162.67186: variable 'ansible_distribution_major_version' from source: facts 15980 1727204162.67206: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204162.67229: variable 'omit' from source: magic vars 15980 1727204162.67283: variable 'omit' from source: magic vars 15980 1727204162.67413: variable 'profile' from source: play vars 15980 1727204162.67424: variable 'interface' from source: set_fact 15980 1727204162.67507: variable 'interface' from source: set_fact 15980 1727204162.67535: variable 'omit' from source: magic vars 15980 1727204162.67600: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204162.67649: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204162.67696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204162.67719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204162.67775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204162.67783: variable 'inventory_hostname' from source: host vars for 
'managed-node2' 15980 1727204162.67792: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204162.67800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204162.67912: Set connection var ansible_connection to ssh 15980 1727204162.67927: Set connection var ansible_pipelining to False 15980 1727204162.67940: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204162.67970: Set connection var ansible_timeout to 10 15980 1727204162.67973: Set connection var ansible_shell_type to sh 15980 1727204162.67976: Set connection var ansible_shell_executable to /bin/sh 15980 1727204162.68014: variable 'ansible_shell_executable' from source: unknown 15980 1727204162.68102: variable 'ansible_connection' from source: unknown 15980 1727204162.68104: variable 'ansible_module_compression' from source: unknown 15980 1727204162.68106: variable 'ansible_shell_type' from source: unknown 15980 1727204162.68108: variable 'ansible_shell_executable' from source: unknown 15980 1727204162.68110: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204162.68112: variable 'ansible_pipelining' from source: unknown 15980 1727204162.68114: variable 'ansible_timeout' from source: unknown 15980 1727204162.68115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204162.68215: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204162.68233: variable 'omit' from source: magic vars 15980 1727204162.68242: starting attempt loop 15980 1727204162.68247: running the handler 15980 1727204162.68383: variable 'lsr_net_profile_exists' from source: set_fact 15980 1727204162.68394: Evaluated conditional 
(lsr_net_profile_exists): True 15980 1727204162.68403: handler run complete 15980 1727204162.68434: attempt loop complete, returning result 15980 1727204162.68442: _execute() done 15980 1727204162.68448: dumping result to json 15980 1727204162.68455: done dumping result, returning 15980 1727204162.68470: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'LSR-TST-br31' [127b8e07-fff9-5f1d-4b72-000000000260] 15980 1727204162.68480: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000260 15980 1727204162.68871: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000260 15980 1727204162.68875: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 15980 1727204162.68924: no more pending results, returning what we have 15980 1727204162.68927: results queue empty 15980 1727204162.68928: checking for any_errors_fatal 15980 1727204162.68933: done checking for any_errors_fatal 15980 1727204162.68934: checking for max_fail_percentage 15980 1727204162.68936: done checking for max_fail_percentage 15980 1727204162.68937: checking to see if all hosts have failed and the running result is not ok 15980 1727204162.68938: done checking to see if all hosts have failed 15980 1727204162.68939: getting the remaining hosts for this loop 15980 1727204162.68940: done getting the remaining hosts for this loop 15980 1727204162.68945: getting the next task for host managed-node2 15980 1727204162.68950: done getting next task for host managed-node2 15980 1727204162.68953: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 15980 1727204162.68956: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204162.68959: getting variables 15980 1727204162.68961: in VariableManager get_vars() 15980 1727204162.68993: Calling all_inventory to load vars for managed-node2 15980 1727204162.68996: Calling groups_inventory to load vars for managed-node2 15980 1727204162.68999: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204162.69011: Calling all_plugins_play to load vars for managed-node2 15980 1727204162.69013: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204162.69017: Calling groups_plugins_play to load vars for managed-node2 15980 1727204162.76189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204162.78395: done with get_vars() 15980 1727204162.78438: done getting variables 15980 1727204162.78497: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15980 1727204162.78611: variable 'profile' from source: play vars 15980 1727204162.78615: variable 'interface' from source: set_fact 15980 1727204162.78683: variable 'interface' from source: set_fact TASK [Assert that the ansible managed comment is present in 'LSR-TST-br31'] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:56:02 -0400 
(0:00:00.131) 0:00:24.197 ***** 15980 1727204162.78717: entering _queue_task() for managed-node2/assert 15980 1727204162.79105: worker is 1 (out of 1 available) 15980 1727204162.79118: exiting _queue_task() for managed-node2/assert 15980 1727204162.79132: done queuing things up, now waiting for results queue to drain 15980 1727204162.79135: waiting for pending results... 15980 1727204162.79447: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' 15980 1727204162.79893: in run() - task 127b8e07-fff9-5f1d-4b72-000000000261 15980 1727204162.79915: variable 'ansible_search_path' from source: unknown 15980 1727204162.79925: variable 'ansible_search_path' from source: unknown 15980 1727204162.79976: calling self._execute() 15980 1727204162.80087: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204162.80150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204162.80155: variable 'omit' from source: magic vars 15980 1727204162.80999: variable 'ansible_distribution_major_version' from source: facts 15980 1727204162.81020: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204162.81035: variable 'omit' from source: magic vars 15980 1727204162.81096: variable 'omit' from source: magic vars 15980 1727204162.81247: variable 'profile' from source: play vars 15980 1727204162.81299: variable 'interface' from source: set_fact 15980 1727204162.81615: variable 'interface' from source: set_fact 15980 1727204162.81712: variable 'omit' from source: magic vars 15980 1727204162.81937: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204162.82025: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204162.82084: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 
1727204162.82117: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204162.82178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204162.82335: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204162.82338: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204162.82342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204162.82789: Set connection var ansible_connection to ssh 15980 1727204162.82793: Set connection var ansible_pipelining to False 15980 1727204162.82795: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204162.82797: Set connection var ansible_timeout to 10 15980 1727204162.82800: Set connection var ansible_shell_type to sh 15980 1727204162.82803: Set connection var ansible_shell_executable to /bin/sh 15980 1727204162.82805: variable 'ansible_shell_executable' from source: unknown 15980 1727204162.82807: variable 'ansible_connection' from source: unknown 15980 1727204162.82810: variable 'ansible_module_compression' from source: unknown 15980 1727204162.82812: variable 'ansible_shell_type' from source: unknown 15980 1727204162.82815: variable 'ansible_shell_executable' from source: unknown 15980 1727204162.82817: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204162.82819: variable 'ansible_pipelining' from source: unknown 15980 1727204162.82822: variable 'ansible_timeout' from source: unknown 15980 1727204162.82825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204162.82935: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204162.82953: variable 'omit' from source: magic vars 15980 1727204162.82968: starting attempt loop 15980 1727204162.82975: running the handler 15980 1727204162.83173: variable 'lsr_net_profile_ansible_managed' from source: set_fact 15980 1727204162.83573: Evaluated conditional (lsr_net_profile_ansible_managed): True 15980 1727204162.83577: handler run complete 15980 1727204162.83580: attempt loop complete, returning result 15980 1727204162.83582: _execute() done 15980 1727204162.83584: dumping result to json 15980 1727204162.83587: done dumping result, returning 15980 1727204162.83589: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' [127b8e07-fff9-5f1d-4b72-000000000261] 15980 1727204162.83591: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000261 15980 1727204162.83679: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000261 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 15980 1727204162.83736: no more pending results, returning what we have 15980 1727204162.83740: results queue empty 15980 1727204162.83741: checking for any_errors_fatal 15980 1727204162.83751: done checking for any_errors_fatal 15980 1727204162.83752: checking for max_fail_percentage 15980 1727204162.83753: done checking for max_fail_percentage 15980 1727204162.83755: checking to see if all hosts have failed and the running result is not ok 15980 1727204162.83756: done checking to see if all hosts have failed 15980 1727204162.83757: getting the remaining hosts for this loop 15980 1727204162.83759: done getting the remaining hosts for this loop 15980 1727204162.83764: getting the next task for host managed-node2 15980 1727204162.83774: done getting next task for host managed-node2 15980 
1727204162.83777: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 15980 1727204162.83780: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204162.83785: getting variables 15980 1727204162.83787: in VariableManager get_vars() 15980 1727204162.83821: Calling all_inventory to load vars for managed-node2 15980 1727204162.83824: Calling groups_inventory to load vars for managed-node2 15980 1727204162.83828: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204162.83843: Calling all_plugins_play to load vars for managed-node2 15980 1727204162.83846: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204162.83850: Calling groups_plugins_play to load vars for managed-node2 15980 1727204162.84975: WORKER PROCESS EXITING 15980 1727204162.88394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204162.92854: done with get_vars() 15980 1727204162.93035: done getting variables 15980 1727204162.93104: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15980 1727204162.93452: variable 'profile' from source: play vars 15980 1727204162.93457: variable 'interface' 
from source: set_fact 15980 1727204162.93527: variable 'interface' from source: set_fact TASK [Assert that the fingerprint comment is present in LSR-TST-br31] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:56:02 -0400 (0:00:00.149) 0:00:24.346 ***** 15980 1727204162.93657: entering _queue_task() for managed-node2/assert 15980 1727204162.94198: worker is 1 (out of 1 available) 15980 1727204162.94214: exiting _queue_task() for managed-node2/assert 15980 1727204162.94226: done queuing things up, now waiting for results queue to drain 15980 1727204162.94228: waiting for pending results... 15980 1727204162.94592: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 15980 1727204162.94725: in run() - task 127b8e07-fff9-5f1d-4b72-000000000262 15980 1727204162.94750: variable 'ansible_search_path' from source: unknown 15980 1727204162.94758: variable 'ansible_search_path' from source: unknown 15980 1727204162.94819: calling self._execute() 15980 1727204162.94944: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204162.95050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204162.95070: variable 'omit' from source: magic vars 15980 1727204162.96535: variable 'ansible_distribution_major_version' from source: facts 15980 1727204162.96539: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204162.96543: variable 'omit' from source: magic vars 15980 1727204162.96557: variable 'omit' from source: magic vars 15980 1727204162.96726: variable 'profile' from source: play vars 15980 1727204162.96811: variable 'interface' from source: set_fact 15980 1727204162.96935: variable 'interface' from source: set_fact 15980 1727204162.97042: variable 'omit' from source: magic vars 15980 1727204162.97098: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204162.97168: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204162.97303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204162.97406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204162.97424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204162.97488: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204162.97572: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204162.97582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204162.97810: Set connection var ansible_connection to ssh 15980 1727204162.97825: Set connection var ansible_pipelining to False 15980 1727204162.98163: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204162.98168: Set connection var ansible_timeout to 10 15980 1727204162.98171: Set connection var ansible_shell_type to sh 15980 1727204162.98174: Set connection var ansible_shell_executable to /bin/sh 15980 1727204162.98176: variable 'ansible_shell_executable' from source: unknown 15980 1727204162.98178: variable 'ansible_connection' from source: unknown 15980 1727204162.98180: variable 'ansible_module_compression' from source: unknown 15980 1727204162.98181: variable 'ansible_shell_type' from source: unknown 15980 1727204162.98183: variable 'ansible_shell_executable' from source: unknown 15980 1727204162.98185: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204162.98187: variable 'ansible_pipelining' from source: unknown 15980 1727204162.98189: variable 'ansible_timeout' from 
source: unknown 15980 1727204162.98192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204162.98487: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204162.98561: variable 'omit' from source: magic vars 15980 1727204162.98578: starting attempt loop 15980 1727204162.98586: running the handler 15980 1727204162.98845: variable 'lsr_net_profile_fingerprint' from source: set_fact 15980 1727204162.99083: Evaluated conditional (lsr_net_profile_fingerprint): True 15980 1727204162.99192: handler run complete 15980 1727204162.99195: attempt loop complete, returning result 15980 1727204162.99198: _execute() done 15980 1727204162.99200: dumping result to json 15980 1727204162.99205: done dumping result, returning 15980 1727204162.99208: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 [127b8e07-fff9-5f1d-4b72-000000000262] 15980 1727204162.99211: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000262 15980 1727204162.99289: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000262 15980 1727204162.99292: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 15980 1727204162.99423: no more pending results, returning what we have 15980 1727204162.99427: results queue empty 15980 1727204162.99428: checking for any_errors_fatal 15980 1727204162.99438: done checking for any_errors_fatal 15980 1727204162.99439: checking for max_fail_percentage 15980 1727204162.99441: done checking for max_fail_percentage 15980 1727204162.99442: checking to see if all hosts have failed and the running result is not ok 15980 1727204162.99443: done checking to see if all 
hosts have failed 15980 1727204162.99444: getting the remaining hosts for this loop 15980 1727204162.99446: done getting the remaining hosts for this loop 15980 1727204162.99451: getting the next task for host managed-node2 15980 1727204162.99459: done getting next task for host managed-node2 15980 1727204162.99462: ^ task is: TASK: meta (flush_handlers) 15980 1727204162.99464: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204162.99472: getting variables 15980 1727204162.99474: in VariableManager get_vars() 15980 1727204162.99506: Calling all_inventory to load vars for managed-node2 15980 1727204162.99510: Calling groups_inventory to load vars for managed-node2 15980 1727204162.99515: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204162.99529: Calling all_plugins_play to load vars for managed-node2 15980 1727204162.99532: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204162.99535: Calling groups_plugins_play to load vars for managed-node2 15980 1727204163.04064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204163.08321: done with get_vars() 15980 1727204163.08370: done getting variables 15980 1727204163.08461: in VariableManager get_vars() 15980 1727204163.08478: Calling all_inventory to load vars for managed-node2 15980 1727204163.08481: Calling groups_inventory to load vars for managed-node2 15980 1727204163.08484: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204163.08490: Calling all_plugins_play to load vars for managed-node2 15980 1727204163.08492: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204163.08496: Calling 
groups_plugins_play to load vars for managed-node2 15980 1727204163.10198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204163.14047: done with get_vars() 15980 1727204163.14095: done queuing things up, now waiting for results queue to drain 15980 1727204163.14097: results queue empty 15980 1727204163.14098: checking for any_errors_fatal 15980 1727204163.14101: done checking for any_errors_fatal 15980 1727204163.14102: checking for max_fail_percentage 15980 1727204163.14103: done checking for max_fail_percentage 15980 1727204163.14110: checking to see if all hosts have failed and the running result is not ok 15980 1727204163.14111: done checking to see if all hosts have failed 15980 1727204163.14112: getting the remaining hosts for this loop 15980 1727204163.14113: done getting the remaining hosts for this loop 15980 1727204163.14116: getting the next task for host managed-node2 15980 1727204163.14120: done getting next task for host managed-node2 15980 1727204163.14122: ^ task is: TASK: meta (flush_handlers) 15980 1727204163.14123: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204163.14126: getting variables 15980 1727204163.14127: in VariableManager get_vars() 15980 1727204163.14137: Calling all_inventory to load vars for managed-node2 15980 1727204163.14139: Calling groups_inventory to load vars for managed-node2 15980 1727204163.14142: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204163.14149: Calling all_plugins_play to load vars for managed-node2 15980 1727204163.14151: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204163.14154: Calling groups_plugins_play to load vars for managed-node2 15980 1727204163.17497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204163.21503: done with get_vars() 15980 1727204163.21540: done getting variables 15980 1727204163.21613: in VariableManager get_vars() 15980 1727204163.21628: Calling all_inventory to load vars for managed-node2 15980 1727204163.21631: Calling groups_inventory to load vars for managed-node2 15980 1727204163.21634: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204163.21640: Calling all_plugins_play to load vars for managed-node2 15980 1727204163.21643: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204163.21646: Calling groups_plugins_play to load vars for managed-node2 15980 1727204163.23773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204163.26243: done with get_vars() 15980 1727204163.26289: done queuing things up, now waiting for results queue to drain 15980 1727204163.26291: results queue empty 15980 1727204163.26292: checking for any_errors_fatal 15980 1727204163.26294: done checking for any_errors_fatal 15980 1727204163.26294: checking for max_fail_percentage 15980 1727204163.26296: done checking for max_fail_percentage 15980 1727204163.26296: checking to see if all hosts have failed and the running result is not 
ok 15980 1727204163.26298: done checking to see if all hosts have failed 15980 1727204163.26298: getting the remaining hosts for this loop 15980 1727204163.26299: done getting the remaining hosts for this loop 15980 1727204163.26302: getting the next task for host managed-node2 15980 1727204163.26306: done getting next task for host managed-node2 15980 1727204163.26307: ^ task is: None 15980 1727204163.26308: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204163.26310: done queuing things up, now waiting for results queue to drain 15980 1727204163.26311: results queue empty 15980 1727204163.26311: checking for any_errors_fatal 15980 1727204163.26312: done checking for any_errors_fatal 15980 1727204163.26313: checking for max_fail_percentage 15980 1727204163.26314: done checking for max_fail_percentage 15980 1727204163.26314: checking to see if all hosts have failed and the running result is not ok 15980 1727204163.26315: done checking to see if all hosts have failed 15980 1727204163.26316: getting the next task for host managed-node2 15980 1727204163.26318: done getting next task for host managed-node2 15980 1727204163.26319: ^ task is: None 15980 1727204163.26320: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204163.26418: in VariableManager get_vars() 15980 1727204163.26446: done with get_vars() 15980 1727204163.26459: in VariableManager get_vars() 15980 1727204163.26478: done with get_vars() 15980 1727204163.26484: variable 'omit' from source: magic vars 15980 1727204163.26616: variable 'profile' from source: play vars 15980 1727204163.26747: in VariableManager get_vars() 15980 1727204163.26763: done with get_vars() 15980 1727204163.26796: variable 'omit' from source: magic vars 15980 1727204163.26876: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 15980 1727204163.27718: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15980 1727204163.27885: getting the remaining hosts for this loop 15980 1727204163.27887: done getting the remaining hosts for this loop 15980 1727204163.27890: getting the next task for host managed-node2 15980 1727204163.27894: done getting next task for host managed-node2 15980 1727204163.27896: ^ task is: TASK: Gathering Facts 15980 1727204163.27898: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204163.27900: getting variables 15980 1727204163.27901: in VariableManager get_vars() 15980 1727204163.28024: Calling all_inventory to load vars for managed-node2 15980 1727204163.28029: Calling groups_inventory to load vars for managed-node2 15980 1727204163.28032: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204163.28039: Calling all_plugins_play to load vars for managed-node2 15980 1727204163.28041: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204163.28044: Calling groups_plugins_play to load vars for managed-node2 15980 1727204163.30336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204163.33812: done with get_vars() 15980 1727204163.33854: done getting variables 15980 1727204163.33907: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Tuesday 24 September 2024 14:56:03 -0400 (0:00:00.402) 0:00:24.749 ***** 15980 1727204163.33935: entering _queue_task() for managed-node2/gather_facts 15980 1727204163.34428: worker is 1 (out of 1 available) 15980 1727204163.34440: exiting _queue_task() for managed-node2/gather_facts 15980 1727204163.34452: done queuing things up, now waiting for results queue to drain 15980 1727204163.34454: waiting for pending results... 
15980 1727204163.34689: running TaskExecutor() for managed-node2/TASK: Gathering Facts 15980 1727204163.34837: in run() - task 127b8e07-fff9-5f1d-4b72-0000000002b5 15980 1727204163.34842: variable 'ansible_search_path' from source: unknown 15980 1727204163.34895: calling self._execute() 15980 1727204163.35164: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204163.35170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204163.35173: variable 'omit' from source: magic vars 15980 1727204163.35525: variable 'ansible_distribution_major_version' from source: facts 15980 1727204163.35547: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204163.35561: variable 'omit' from source: magic vars 15980 1727204163.35606: variable 'omit' from source: magic vars 15980 1727204163.35657: variable 'omit' from source: magic vars 15980 1727204163.35715: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204163.35764: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204163.35796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204163.35826: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204163.35850: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204163.35928: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204163.35931: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204163.35934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204163.36022: Set connection var ansible_connection to ssh 15980 1727204163.36041: Set 
connection var ansible_pipelining to False 15980 1727204163.36057: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204163.36072: Set connection var ansible_timeout to 10 15980 1727204163.36082: Set connection var ansible_shell_type to sh 15980 1727204163.36143: Set connection var ansible_shell_executable to /bin/sh 15980 1727204163.36147: variable 'ansible_shell_executable' from source: unknown 15980 1727204163.36149: variable 'ansible_connection' from source: unknown 15980 1727204163.36151: variable 'ansible_module_compression' from source: unknown 15980 1727204163.36155: variable 'ansible_shell_type' from source: unknown 15980 1727204163.36158: variable 'ansible_shell_executable' from source: unknown 15980 1727204163.36160: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204163.36162: variable 'ansible_pipelining' from source: unknown 15980 1727204163.36254: variable 'ansible_timeout' from source: unknown 15980 1727204163.36257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204163.36392: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204163.36410: variable 'omit' from source: magic vars 15980 1727204163.36421: starting attempt loop 15980 1727204163.36427: running the handler 15980 1727204163.36448: variable 'ansible_facts' from source: unknown 15980 1727204163.36477: _low_level_execute_command(): starting 15980 1727204163.36493: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204163.37617: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204163.37622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204163.37625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204163.37651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204163.37763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204163.39590: stdout chunk (state=3): >>>/root <<< 15980 1727204163.39771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204163.40072: stdout chunk (state=3): >>><<< 15980 1727204163.40077: stderr chunk (state=3): >>><<< 15980 1727204163.40081: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204163.40084: _low_level_execute_command(): starting 15980 1727204163.40087: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204163.3982606-18192-229155118394826 `" && echo ansible-tmp-1727204163.3982606-18192-229155118394826="` echo /root/.ansible/tmp/ansible-tmp-1727204163.3982606-18192-229155118394826 `" ) && sleep 0' 15980 1727204163.41207: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204163.41581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204163.41656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204163.41701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204163.41885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204163.43895: stdout chunk (state=3): >>>ansible-tmp-1727204163.3982606-18192-229155118394826=/root/.ansible/tmp/ansible-tmp-1727204163.3982606-18192-229155118394826 <<< 15980 1727204163.44374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204163.44379: stdout chunk (state=3): >>><<< 15980 1727204163.44381: stderr chunk (state=3): >>><<< 15980 1727204163.44384: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204163.3982606-18192-229155118394826=/root/.ansible/tmp/ansible-tmp-1727204163.3982606-18192-229155118394826 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204163.44388: variable 'ansible_module_compression' from source: unknown 15980 1727204163.44390: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15980 1727204163.44774: variable 'ansible_facts' from source: unknown 15980 1727204163.45104: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204163.3982606-18192-229155118394826/AnsiballZ_setup.py 15980 1727204163.45496: Sending initial data 15980 1727204163.45509: Sent initial data (154 bytes) 15980 1727204163.46556: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204163.46687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204163.46749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204163.46763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204163.46875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204163.48491: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15980 1727204163.48517: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204163.48614: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204163.48708: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp44qidbvc /root/.ansible/tmp/ansible-tmp-1727204163.3982606-18192-229155118394826/AnsiballZ_setup.py <<< 15980 1727204163.48718: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204163.3982606-18192-229155118394826/AnsiballZ_setup.py" <<< 15980 1727204163.48810: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp44qidbvc" to remote "/root/.ansible/tmp/ansible-tmp-1727204163.3982606-18192-229155118394826/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204163.3982606-18192-229155118394826/AnsiballZ_setup.py" <<< 15980 1727204163.50728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204163.50877: stderr chunk (state=3): >>><<< 15980 1727204163.50889: stdout chunk (state=3): >>><<< 15980 1727204163.50937: done transferring module to remote 15980 1727204163.50958: _low_level_execute_command(): starting 15980 1727204163.50970: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204163.3982606-18192-229155118394826/ /root/.ansible/tmp/ansible-tmp-1727204163.3982606-18192-229155118394826/AnsiballZ_setup.py && sleep 0' 15980 1727204163.52153: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204163.52199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204163.52217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204163.52246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204163.52353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204163.52380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204163.52490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204163.54409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204163.54444: stdout chunk (state=3): >>><<< 15980 1727204163.54448: stderr chunk (state=3): >>><<< 15980 1727204163.54464: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204163.54498: _low_level_execute_command(): starting 15980 1727204163.54502: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204163.3982606-18192-229155118394826/AnsiballZ_setup.py && sleep 0' 15980 1727204163.55309: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204163.55324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204163.55387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204163.55418: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204163.55484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 
1727204163.55671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204164.21015: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.79541015625, "5m": 0.5361328125, "15m": 0.26318359375}, "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": 
"auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,11<<< 15980 1727204164.21048: stdout chunk (state=3): >>>5200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "03", "epoch": "1727204163", "epoch_int": "1727204163", "date": "2024-09-24", "time": "14:56:03", "iso8601_micro": "2024-09-24T18:56:03.858819Z", "iso8601": "2024-09-24T18:56:03Z", "iso8601_basic": "20240924T145603858819", "iso8601_basic_short": "20240924T145603", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3055, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 661, "free": 3055}, "nocache": {"free": 3485, "used": 231}, 
"swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 510, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325673472, "block_size": 4096, "block_total": 64479564, "block_available": 61358807, "block_used": 3120757, "inode_total": 16384000, "inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_interfaces": ["LSR-TST-br31", "lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "42:c8:ed:20:3e:1e", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", 
"127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15980 1727204164.23043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204164.23104: stderr chunk (state=3): >>><<< 15980 1727204164.23108: stdout chunk (state=3): >>><<< 15980 1727204164.23133: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.79541015625, "5m": 0.5361328125, "15m": 0.26318359375}, "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "03", "epoch": "1727204163", "epoch_int": "1727204163", "date": "2024-09-24", "time": "14:56:03", "iso8601_micro": "2024-09-24T18:56:03.858819Z", "iso8601": "2024-09-24T18:56:03Z", "iso8601_basic": "20240924T145603858819", "iso8601_basic_short": "20240924T145603", 
"tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3055, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 661, "free": 3055}, "nocache": {"free": 3485, "used": 231}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": 
[]}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 510, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325673472, "block_size": 4096, "block_total": 64479564, "block_available": 61358807, "block_used": 3120757, "inode_total": 16384000, "inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_interfaces": ["LSR-TST-br31", "lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "42:c8:ed:20:3e:1e", "mtu": 1500, "active": false, "type": 
"bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.47.73 closed. 15980 1727204164.23361: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204163.3982606-18192-229155118394826/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204164.23374: _low_level_execute_command(): starting 15980 1727204164.23380: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204163.3982606-18192-229155118394826/ > /dev/null 2>&1 && sleep 0' 15980 1727204164.23872: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204164.23877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204164.23882: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204164.23894: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204164.23956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204164.23960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204164.23963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204164.24025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204164.25929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204164.25989: stderr chunk (state=3): >>><<< 15980 1727204164.25992: stdout chunk (state=3): >>><<< 15980 1727204164.26005: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204164.26015: handler run complete 15980 1727204164.26107: variable 'ansible_facts' from source: unknown 15980 1727204164.26186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204164.26393: variable 'ansible_facts' from source: unknown 15980 1727204164.26452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204164.26534: attempt loop complete, returning result 15980 1727204164.26538: _execute() done 15980 1727204164.26541: dumping result to json 15980 1727204164.26560: done dumping result, returning 15980 1727204164.26571: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-5f1d-4b72-0000000002b5] 15980 1727204164.26575: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000002b5 15980 1727204164.26821: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000002b5 15980 1727204164.26824: WORKER PROCESS EXITING ok: [managed-node2] 15980 1727204164.27061: no more pending results, returning what we have 15980 1727204164.27063: results queue empty 15980 1727204164.27064: checking for any_errors_fatal 15980 1727204164.27066: done checking for any_errors_fatal 15980 1727204164.27067: checking for max_fail_percentage 15980 1727204164.27069: done checking for max_fail_percentage 15980 1727204164.27069: checking to see if all hosts have failed and the running result is not ok 15980 1727204164.27070: done checking to see if all hosts have failed 15980 1727204164.27070: getting the remaining hosts for this loop 15980 1727204164.27071: done getting the remaining hosts for this loop 15980 1727204164.27074: getting the next task for host managed-node2 15980 1727204164.27078: done getting next task for host managed-node2 15980 1727204164.27080: ^ task is: TASK: meta 
(flush_handlers) 15980 1727204164.27081: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204164.27084: getting variables 15980 1727204164.27085: in VariableManager get_vars() 15980 1727204164.27108: Calling all_inventory to load vars for managed-node2 15980 1727204164.27110: Calling groups_inventory to load vars for managed-node2 15980 1727204164.27111: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204164.27120: Calling all_plugins_play to load vars for managed-node2 15980 1727204164.27122: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204164.27124: Calling groups_plugins_play to load vars for managed-node2 15980 1727204164.28640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204164.30547: done with get_vars() 15980 1727204164.30594: done getting variables 15980 1727204164.30705: in VariableManager get_vars() 15980 1727204164.30730: Calling all_inventory to load vars for managed-node2 15980 1727204164.30733: Calling groups_inventory to load vars for managed-node2 15980 1727204164.30735: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204164.30741: Calling all_plugins_play to load vars for managed-node2 15980 1727204164.30744: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204164.30747: Calling groups_plugins_play to load vars for managed-node2 15980 1727204164.32082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204164.33541: done with get_vars() 15980 1727204164.33573: done queuing things up, now waiting for results queue to drain 15980 1727204164.33575: results queue 
empty 15980 1727204164.33576: checking for any_errors_fatal 15980 1727204164.33579: done checking for any_errors_fatal 15980 1727204164.33579: checking for max_fail_percentage 15980 1727204164.33580: done checking for max_fail_percentage 15980 1727204164.33585: checking to see if all hosts have failed and the running result is not ok 15980 1727204164.33585: done checking to see if all hosts have failed 15980 1727204164.33586: getting the remaining hosts for this loop 15980 1727204164.33587: done getting the remaining hosts for this loop 15980 1727204164.33589: getting the next task for host managed-node2 15980 1727204164.33592: done getting next task for host managed-node2 15980 1727204164.33594: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15980 1727204164.33595: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204164.33604: getting variables 15980 1727204164.33604: in VariableManager get_vars() 15980 1727204164.33617: Calling all_inventory to load vars for managed-node2 15980 1727204164.33618: Calling groups_inventory to load vars for managed-node2 15980 1727204164.33619: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204164.33624: Calling all_plugins_play to load vars for managed-node2 15980 1727204164.33627: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204164.33629: Calling groups_plugins_play to load vars for managed-node2 15980 1727204164.34517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204164.35694: done with get_vars() 15980 1727204164.35720: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:56:04 -0400 (0:00:01.018) 0:00:25.768 ***** 15980 1727204164.35784: entering _queue_task() for managed-node2/include_tasks 15980 1727204164.36070: worker is 1 (out of 1 available) 15980 1727204164.36087: exiting _queue_task() for managed-node2/include_tasks 15980 1727204164.36098: done queuing things up, now waiting for results queue to drain 15980 1727204164.36101: waiting for pending results... 
15980 1727204164.36298: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15980 1727204164.36379: in run() - task 127b8e07-fff9-5f1d-4b72-00000000003a 15980 1727204164.36392: variable 'ansible_search_path' from source: unknown 15980 1727204164.36395: variable 'ansible_search_path' from source: unknown 15980 1727204164.36572: calling self._execute() 15980 1727204164.36579: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204164.36592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204164.36611: variable 'omit' from source: magic vars 15980 1727204164.37068: variable 'ansible_distribution_major_version' from source: facts 15980 1727204164.37103: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204164.37111: _execute() done 15980 1727204164.37115: dumping result to json 15980 1727204164.37117: done dumping result, returning 15980 1727204164.37121: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-5f1d-4b72-00000000003a] 15980 1727204164.37125: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000003a 15980 1727204164.37265: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000003a 15980 1727204164.37268: WORKER PROCESS EXITING 15980 1727204164.37387: no more pending results, returning what we have 15980 1727204164.37392: in VariableManager get_vars() 15980 1727204164.37496: Calling all_inventory to load vars for managed-node2 15980 1727204164.37500: Calling groups_inventory to load vars for managed-node2 15980 1727204164.37503: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204164.37515: Calling all_plugins_play to load vars for managed-node2 15980 1727204164.37518: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204164.37522: Calling 
groups_plugins_play to load vars for managed-node2 15980 1727204164.39140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204164.40523: done with get_vars() 15980 1727204164.40552: variable 'ansible_search_path' from source: unknown 15980 1727204164.40553: variable 'ansible_search_path' from source: unknown 15980 1727204164.40581: we have included files to process 15980 1727204164.40582: generating all_blocks data 15980 1727204164.40583: done generating all_blocks data 15980 1727204164.40584: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15980 1727204164.40584: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15980 1727204164.40586: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15980 1727204164.41014: done processing included file 15980 1727204164.41015: iterating over new_blocks loaded from include file 15980 1727204164.41016: in VariableManager get_vars() 15980 1727204164.41034: done with get_vars() 15980 1727204164.41035: filtering new block on tags 15980 1727204164.41047: done filtering new block on tags 15980 1727204164.41049: in VariableManager get_vars() 15980 1727204164.41064: done with get_vars() 15980 1727204164.41066: filtering new block on tags 15980 1727204164.41080: done filtering new block on tags 15980 1727204164.41081: in VariableManager get_vars() 15980 1727204164.41094: done with get_vars() 15980 1727204164.41095: filtering new block on tags 15980 1727204164.41105: done filtering new block on tags 15980 1727204164.41106: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 15980 1727204164.41111: extending task lists for 
all hosts with included blocks 15980 1727204164.41490: done extending task lists 15980 1727204164.41491: done processing included files 15980 1727204164.41492: results queue empty 15980 1727204164.41493: checking for any_errors_fatal 15980 1727204164.41494: done checking for any_errors_fatal 15980 1727204164.41495: checking for max_fail_percentage 15980 1727204164.41496: done checking for max_fail_percentage 15980 1727204164.41497: checking to see if all hosts have failed and the running result is not ok 15980 1727204164.41498: done checking to see if all hosts have failed 15980 1727204164.41499: getting the remaining hosts for this loop 15980 1727204164.41500: done getting the remaining hosts for this loop 15980 1727204164.41506: getting the next task for host managed-node2 15980 1727204164.41514: done getting next task for host managed-node2 15980 1727204164.41518: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15980 1727204164.41520: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204164.41534: getting variables 15980 1727204164.41535: in VariableManager get_vars() 15980 1727204164.41560: Calling all_inventory to load vars for managed-node2 15980 1727204164.41563: Calling groups_inventory to load vars for managed-node2 15980 1727204164.41567: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204164.41573: Calling all_plugins_play to load vars for managed-node2 15980 1727204164.41576: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204164.41579: Calling groups_plugins_play to load vars for managed-node2 15980 1727204164.42523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204164.43916: done with get_vars() 15980 1727204164.43956: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:56:04 -0400 (0:00:00.082) 0:00:25.850 ***** 15980 1727204164.44036: entering _queue_task() for managed-node2/setup 15980 1727204164.44366: worker is 1 (out of 1 available) 15980 1727204164.44382: exiting _queue_task() for managed-node2/setup 15980 1727204164.44394: done queuing things up, now waiting for results queue to drain 15980 1727204164.44397: waiting for pending results... 
15980 1727204164.44671: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15980 1727204164.44776: in run() - task 127b8e07-fff9-5f1d-4b72-0000000002f6 15980 1727204164.44787: variable 'ansible_search_path' from source: unknown 15980 1727204164.44791: variable 'ansible_search_path' from source: unknown 15980 1727204164.44827: calling self._execute() 15980 1727204164.44917: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204164.44928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204164.44946: variable 'omit' from source: magic vars 15980 1727204164.45341: variable 'ansible_distribution_major_version' from source: facts 15980 1727204164.45346: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204164.45602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204164.48079: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204164.48095: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204164.48154: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204164.48196: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204164.48231: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204164.48364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204164.48399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204164.48470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204164.48505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204164.48622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204164.48713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204164.48826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204164.48830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204164.48911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204164.48915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204164.49251: variable '__network_required_facts' from source: role 
'' defaults 15980 1727204164.49332: variable 'ansible_facts' from source: unknown 15980 1727204164.50521: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15980 1727204164.50571: when evaluation is False, skipping this task 15980 1727204164.50575: _execute() done 15980 1727204164.50578: dumping result to json 15980 1727204164.50580: done dumping result, returning 15980 1727204164.50583: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-5f1d-4b72-0000000002f6] 15980 1727204164.50586: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000002f6 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15980 1727204164.50744: no more pending results, returning what we have 15980 1727204164.50750: results queue empty 15980 1727204164.50750: checking for any_errors_fatal 15980 1727204164.50751: done checking for any_errors_fatal 15980 1727204164.50752: checking for max_fail_percentage 15980 1727204164.50754: done checking for max_fail_percentage 15980 1727204164.50755: checking to see if all hosts have failed and the running result is not ok 15980 1727204164.50756: done checking to see if all hosts have failed 15980 1727204164.50756: getting the remaining hosts for this loop 15980 1727204164.50758: done getting the remaining hosts for this loop 15980 1727204164.50763: getting the next task for host managed-node2 15980 1727204164.50773: done getting next task for host managed-node2 15980 1727204164.50780: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15980 1727204164.50783: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204164.50798: getting variables 15980 1727204164.50800: in VariableManager get_vars() 15980 1727204164.50846: Calling all_inventory to load vars for managed-node2 15980 1727204164.50849: Calling groups_inventory to load vars for managed-node2 15980 1727204164.50851: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204164.51091: Calling all_plugins_play to load vars for managed-node2 15980 1727204164.51096: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204164.51101: Calling groups_plugins_play to load vars for managed-node2 15980 1727204164.51654: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000002f6 15980 1727204164.51659: WORKER PROCESS EXITING 15980 1727204164.54250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204164.55799: done with get_vars() 15980 1727204164.55831: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:56:04 -0400 (0:00:00.118) 0:00:25.969 ***** 15980 1727204164.55913: entering _queue_task() for managed-node2/stat 15980 1727204164.56221: worker is 1 (out of 1 available) 15980 1727204164.56239: exiting _queue_task() for managed-node2/stat 15980 1727204164.56252: done queuing things up, now waiting for results queue to drain 15980 1727204164.56255: waiting for pending results... 
15980 1727204164.56454: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 15980 1727204164.56545: in run() - task 127b8e07-fff9-5f1d-4b72-0000000002f8 15980 1727204164.56560: variable 'ansible_search_path' from source: unknown 15980 1727204164.56564: variable 'ansible_search_path' from source: unknown 15980 1727204164.56600: calling self._execute() 15980 1727204164.56680: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204164.56689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204164.56750: variable 'omit' from source: magic vars 15980 1727204164.57101: variable 'ansible_distribution_major_version' from source: facts 15980 1727204164.57105: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204164.57269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204164.57492: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204164.57530: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204164.57569: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204164.57634: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204164.57745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204164.57756: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204164.57779: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204164.57801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204164.57893: variable '__network_is_ostree' from source: set_fact 15980 1727204164.57902: Evaluated conditional (not __network_is_ostree is defined): False 15980 1727204164.57905: when evaluation is False, skipping this task 15980 1727204164.57908: _execute() done 15980 1727204164.57910: dumping result to json 15980 1727204164.57912: done dumping result, returning 15980 1727204164.57925: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-5f1d-4b72-0000000002f8] 15980 1727204164.57930: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000002f8 15980 1727204164.58040: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000002f8 15980 1727204164.58043: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15980 1727204164.58097: no more pending results, returning what we have 15980 1727204164.58101: results queue empty 15980 1727204164.58102: checking for any_errors_fatal 15980 1727204164.58107: done checking for any_errors_fatal 15980 1727204164.58108: checking for max_fail_percentage 15980 1727204164.58109: done checking for max_fail_percentage 15980 1727204164.58110: checking to see if all hosts have failed and the running result is not ok 15980 1727204164.58112: done checking to see if all hosts have failed 15980 1727204164.58112: getting the remaining hosts for this loop 15980 1727204164.58114: done getting the remaining hosts for this loop 15980 
1727204164.58119: getting the next task for host managed-node2 15980 1727204164.58125: done getting next task for host managed-node2 15980 1727204164.58129: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15980 1727204164.58134: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204164.58151: getting variables 15980 1727204164.58153: in VariableManager get_vars() 15980 1727204164.58196: Calling all_inventory to load vars for managed-node2 15980 1727204164.58199: Calling groups_inventory to load vars for managed-node2 15980 1727204164.58201: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204164.58212: Calling all_plugins_play to load vars for managed-node2 15980 1727204164.58215: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204164.58218: Calling groups_plugins_play to load vars for managed-node2 15980 1727204164.59610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204164.61239: done with get_vars() 15980 1727204164.61272: done getting variables 15980 1727204164.61321: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:56:04 -0400 (0:00:00.054) 0:00:26.023 ***** 15980 1727204164.61355: entering _queue_task() for managed-node2/set_fact 15980 1727204164.61739: worker is 1 (out of 1 available) 15980 1727204164.61755: exiting _queue_task() for managed-node2/set_fact 15980 1727204164.61773: done queuing things up, now waiting for results queue to drain 15980 1727204164.61778: waiting for pending results... 15980 1727204164.61995: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15980 1727204164.62099: in run() - task 127b8e07-fff9-5f1d-4b72-0000000002f9 15980 1727204164.62112: variable 'ansible_search_path' from source: unknown 15980 1727204164.62116: variable 'ansible_search_path' from source: unknown 15980 1727204164.62151: calling self._execute() 15980 1727204164.62231: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204164.62244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204164.62259: variable 'omit' from source: magic vars 15980 1727204164.62576: variable 'ansible_distribution_major_version' from source: facts 15980 1727204164.62588: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204164.62724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204164.62946: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204164.62984: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204164.63013: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 
1727204164.63042: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204164.63114: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204164.63138: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204164.63158: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204164.63215: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204164.63331: variable '__network_is_ostree' from source: set_fact 15980 1727204164.63334: Evaluated conditional (not __network_is_ostree is defined): False 15980 1727204164.63337: when evaluation is False, skipping this task 15980 1727204164.63339: _execute() done 15980 1727204164.63342: dumping result to json 15980 1727204164.63344: done dumping result, returning 15980 1727204164.63367: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-5f1d-4b72-0000000002f9] 15980 1727204164.63372: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000002f9 15980 1727204164.63473: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000002f9 15980 1727204164.63477: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15980 1727204164.63525: no more pending results, returning what we 
have 15980 1727204164.63529: results queue empty 15980 1727204164.63530: checking for any_errors_fatal 15980 1727204164.63536: done checking for any_errors_fatal 15980 1727204164.63537: checking for max_fail_percentage 15980 1727204164.63538: done checking for max_fail_percentage 15980 1727204164.63540: checking to see if all hosts have failed and the running result is not ok 15980 1727204164.63541: done checking to see if all hosts have failed 15980 1727204164.63542: getting the remaining hosts for this loop 15980 1727204164.63543: done getting the remaining hosts for this loop 15980 1727204164.63548: getting the next task for host managed-node2 15980 1727204164.63557: done getting next task for host managed-node2 15980 1727204164.63561: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15980 1727204164.63564: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204164.63581: getting variables 15980 1727204164.63583: in VariableManager get_vars() 15980 1727204164.63624: Calling all_inventory to load vars for managed-node2 15980 1727204164.63627: Calling groups_inventory to load vars for managed-node2 15980 1727204164.63657: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204164.63671: Calling all_plugins_play to load vars for managed-node2 15980 1727204164.63674: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204164.63677: Calling groups_plugins_play to load vars for managed-node2 15980 1727204164.64914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204164.66929: done with get_vars() 15980 1727204164.66959: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:56:04 -0400 (0:00:00.056) 0:00:26.080 ***** 15980 1727204164.67044: entering _queue_task() for managed-node2/service_facts 15980 1727204164.67409: worker is 1 (out of 1 available) 15980 1727204164.67428: exiting _queue_task() for managed-node2/service_facts 15980 1727204164.67441: done queuing things up, now waiting for results queue to drain 15980 1727204164.67444: waiting for pending results... 
15980 1727204164.68091: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 15980 1727204164.68150: in run() - task 127b8e07-fff9-5f1d-4b72-0000000002fb 15980 1727204164.68167: variable 'ansible_search_path' from source: unknown 15980 1727204164.68171: variable 'ansible_search_path' from source: unknown 15980 1727204164.68673: calling self._execute() 15980 1727204164.68677: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204164.68681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204164.68684: variable 'omit' from source: magic vars 15980 1727204164.69349: variable 'ansible_distribution_major_version' from source: facts 15980 1727204164.69374: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204164.69389: variable 'omit' from source: magic vars 15980 1727204164.69464: variable 'omit' from source: magic vars 15980 1727204164.69518: variable 'omit' from source: magic vars 15980 1727204164.69575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204164.69621: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204164.69653: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204164.69682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204164.69701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204164.69740: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204164.69750: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204164.69758: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 15980 1727204164.69875: Set connection var ansible_connection to ssh 15980 1727204164.69889: Set connection var ansible_pipelining to False 15980 1727204164.69900: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204164.69911: Set connection var ansible_timeout to 10 15980 1727204164.69921: Set connection var ansible_shell_type to sh 15980 1727204164.69935: Set connection var ansible_shell_executable to /bin/sh 15980 1727204164.69973: variable 'ansible_shell_executable' from source: unknown 15980 1727204164.69983: variable 'ansible_connection' from source: unknown 15980 1727204164.69990: variable 'ansible_module_compression' from source: unknown 15980 1727204164.69998: variable 'ansible_shell_type' from source: unknown 15980 1727204164.70006: variable 'ansible_shell_executable' from source: unknown 15980 1727204164.70015: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204164.70023: variable 'ansible_pipelining' from source: unknown 15980 1727204164.70033: variable 'ansible_timeout' from source: unknown 15980 1727204164.70042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204164.70264: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204164.70287: variable 'omit' from source: magic vars 15980 1727204164.70297: starting attempt loop 15980 1727204164.70303: running the handler 15980 1727204164.70320: _low_level_execute_command(): starting 15980 1727204164.70334: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204164.71097: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204164.71118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15980 1727204164.71139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204164.71254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204164.71287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204164.71306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204164.71417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204164.73199: stdout chunk (state=3): >>>/root <<< 15980 1727204164.73404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204164.73445: stdout chunk (state=3): >>><<< 15980 1727204164.73448: stderr chunk (state=3): >>><<< 15980 1727204164.73469: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204164.73490: _low_level_execute_command(): starting 15980 1727204164.73503: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204164.7347662-18235-231099063007476 `" && echo ansible-tmp-1727204164.7347662-18235-231099063007476="` echo /root/.ansible/tmp/ansible-tmp-1727204164.7347662-18235-231099063007476 `" ) && sleep 0' 15980 1727204164.74217: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204164.74339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204164.74357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204164.74387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204164.74405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204164.74431: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204164.74563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204164.76545: stdout chunk (state=3): >>>ansible-tmp-1727204164.7347662-18235-231099063007476=/root/.ansible/tmp/ansible-tmp-1727204164.7347662-18235-231099063007476 <<< 15980 1727204164.76677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204164.76758: stderr chunk (state=3): >>><<< 15980 1727204164.76970: stdout chunk (state=3): >>><<< 15980 1727204164.76974: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204164.7347662-18235-231099063007476=/root/.ansible/tmp/ansible-tmp-1727204164.7347662-18235-231099063007476 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204164.76976: variable 'ansible_module_compression' from source: unknown 15980 1727204164.76979: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15980 1727204164.76981: variable 'ansible_facts' from source: unknown 15980 1727204164.77063: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204164.7347662-18235-231099063007476/AnsiballZ_service_facts.py 15980 1727204164.77231: Sending initial data 15980 1727204164.77331: Sent initial data (162 bytes) 15980 1727204164.78015: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204164.78105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204164.78164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204164.78206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204164.78222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204164.78332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204164.79960: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204164.80071: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204164.80161: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpr57m49uo /root/.ansible/tmp/ansible-tmp-1727204164.7347662-18235-231099063007476/AnsiballZ_service_facts.py <<< 15980 1727204164.80165: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204164.7347662-18235-231099063007476/AnsiballZ_service_facts.py" <<< 15980 1727204164.80236: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpr57m49uo" to remote "/root/.ansible/tmp/ansible-tmp-1727204164.7347662-18235-231099063007476/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204164.7347662-18235-231099063007476/AnsiballZ_service_facts.py" <<< 15980 1727204164.81258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204164.81275: stderr chunk (state=3): >>><<< 15980 1727204164.81284: stdout chunk (state=3): >>><<< 15980 1727204164.81313: done transferring module to remote 15980 1727204164.81347: _low_level_execute_command(): starting 15980 1727204164.81433: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204164.7347662-18235-231099063007476/ /root/.ansible/tmp/ansible-tmp-1727204164.7347662-18235-231099063007476/AnsiballZ_service_facts.py && sleep 0' 15980 1727204164.82076: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204164.82094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204164.82110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204164.82143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204164.82181: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 1727204164.82194: stderr chunk (state=3): >>>debug2: match found <<< 15980 1727204164.82247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204164.82298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204164.82319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204164.82373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204164.82481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204164.84415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204164.84436: stderr chunk (state=3): >>><<< 15980 1727204164.84445: stdout chunk (state=3): >>><<< 15980 1727204164.84473: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204164.84483: _low_level_execute_command(): starting 15980 1727204164.84518: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204164.7347662-18235-231099063007476/AnsiballZ_service_facts.py && sleep 0' 15980 1727204164.85362: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 
1727204164.85422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204164.85511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204167.02381: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": 
{"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": 
{"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": 
{"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, 
"user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": 
"dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": 
"nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": 
"selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": 
{"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": 
"systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15980 1727204167.04176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 15980 1727204167.04180: stdout chunk (state=3): >>><<< 15980 1727204167.04182: stderr chunk (state=3): >>><<< 15980 1727204167.04187: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", 
"source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": 
"fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": 
{"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 15980 1727204167.06653: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204164.7347662-18235-231099063007476/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204167.06662: _low_level_execute_command(): starting 15980 1727204167.06669: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204164.7347662-18235-231099063007476/ > /dev/null 2>&1 && sleep 0' 15980 1727204167.07375: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204167.07380: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204167.07382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204167.07390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204167.07393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 1727204167.07398: stderr chunk (state=3): >>>debug2: match not found <<< 15980 
1727204167.07404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204167.07407: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15980 1727204167.07409: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 15980 1727204167.07411: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15980 1727204167.07418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204167.07430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204167.07476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204167.07479: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 1727204167.07481: stderr chunk (state=3): >>>debug2: match found <<< 15980 1727204167.07483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204167.07542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204167.07563: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204167.07577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204167.07677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204167.09820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204167.10283: stderr chunk (state=3): >>><<< 15980 1727204167.10288: stdout chunk (state=3): >>><<< 15980 1727204167.10291: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204167.10293: handler run complete 15980 1727204167.10498: variable 'ansible_facts' from source: unknown 15980 1727204167.10950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204167.11790: variable 'ansible_facts' from source: unknown 15980 1727204167.11982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204167.12288: attempt loop complete, returning result 15980 1727204167.12304: _execute() done 15980 1727204167.12311: dumping result to json 15980 1727204167.12397: done dumping result, returning 15980 1727204167.12418: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-5f1d-4b72-0000000002fb] 15980 1727204167.12427: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000002fb 15980 1727204167.14374: done 
sending task result for task 127b8e07-fff9-5f1d-4b72-0000000002fb 15980 1727204167.14378: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15980 1727204167.14487: no more pending results, returning what we have 15980 1727204167.14490: results queue empty 15980 1727204167.14491: checking for any_errors_fatal 15980 1727204167.14493: done checking for any_errors_fatal 15980 1727204167.14494: checking for max_fail_percentage 15980 1727204167.14496: done checking for max_fail_percentage 15980 1727204167.14497: checking to see if all hosts have failed and the running result is not ok 15980 1727204167.14498: done checking to see if all hosts have failed 15980 1727204167.14498: getting the remaining hosts for this loop 15980 1727204167.14500: done getting the remaining hosts for this loop 15980 1727204167.14503: getting the next task for host managed-node2 15980 1727204167.14509: done getting next task for host managed-node2 15980 1727204167.14512: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15980 1727204167.14515: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204167.14525: getting variables 15980 1727204167.14526: in VariableManager get_vars() 15980 1727204167.14556: Calling all_inventory to load vars for managed-node2 15980 1727204167.14559: Calling groups_inventory to load vars for managed-node2 15980 1727204167.14561: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204167.14573: Calling all_plugins_play to load vars for managed-node2 15980 1727204167.14576: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204167.14579: Calling groups_plugins_play to load vars for managed-node2 15980 1727204167.16591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204167.28958: done with get_vars() 15980 1727204167.29021: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:56:07 -0400 (0:00:02.621) 0:00:28.702 ***** 15980 1727204167.29187: entering _queue_task() for managed-node2/package_facts 15980 1727204167.29802: worker is 1 (out of 1 available) 15980 1727204167.29814: exiting _queue_task() for managed-node2/package_facts 15980 1727204167.29827: done queuing things up, now waiting for results queue to drain 15980 1727204167.29830: waiting for pending results... 
15980 1727204167.30079: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 15980 1727204167.30188: in run() - task 127b8e07-fff9-5f1d-4b72-0000000002fc 15980 1727204167.30210: variable 'ansible_search_path' from source: unknown 15980 1727204167.30218: variable 'ansible_search_path' from source: unknown 15980 1727204167.30262: calling self._execute() 15980 1727204167.30385: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204167.30398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204167.30414: variable 'omit' from source: magic vars 15980 1727204167.31124: variable 'ansible_distribution_major_version' from source: facts 15980 1727204167.31179: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204167.31257: variable 'omit' from source: magic vars 15980 1727204167.31308: variable 'omit' from source: magic vars 15980 1727204167.31355: variable 'omit' from source: magic vars 15980 1727204167.31413: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204167.31510: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204167.31559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204167.31592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204167.31610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204167.31645: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204167.31669: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204167.31672: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 15980 1727204167.31779: Set connection var ansible_connection to ssh 15980 1727204167.31801: Set connection var ansible_pipelining to False 15980 1727204167.31870: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204167.31873: Set connection var ansible_timeout to 10 15980 1727204167.31876: Set connection var ansible_shell_type to sh 15980 1727204167.31878: Set connection var ansible_shell_executable to /bin/sh 15980 1727204167.31880: variable 'ansible_shell_executable' from source: unknown 15980 1727204167.31883: variable 'ansible_connection' from source: unknown 15980 1727204167.31885: variable 'ansible_module_compression' from source: unknown 15980 1727204167.31888: variable 'ansible_shell_type' from source: unknown 15980 1727204167.31890: variable 'ansible_shell_executable' from source: unknown 15980 1727204167.31895: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204167.31910: variable 'ansible_pipelining' from source: unknown 15980 1727204167.31918: variable 'ansible_timeout' from source: unknown 15980 1727204167.31925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204167.32182: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204167.32199: variable 'omit' from source: magic vars 15980 1727204167.32208: starting attempt loop 15980 1727204167.32215: running the handler 15980 1727204167.32238: _low_level_execute_command(): starting 15980 1727204167.32347: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204167.33448: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204167.33524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204167.33564: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204167.33593: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204167.33610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204167.33722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204167.35486: stdout chunk (state=3): >>>/root <<< 15980 1727204167.35727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204167.35854: stderr chunk (state=3): >>><<< 15980 1727204167.35875: stdout chunk (state=3): >>><<< 15980 1727204167.35902: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204167.36047: _low_level_execute_command(): starting 15980 1727204167.36052: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204167.3590882-18297-67798689942181 `" && echo ansible-tmp-1727204167.3590882-18297-67798689942181="` echo /root/.ansible/tmp/ansible-tmp-1727204167.3590882-18297-67798689942181 `" ) && sleep 0' 15980 1727204167.36943: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204167.36947: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204167.37093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204167.37355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204167.37425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204167.37463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204167.37537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204167.39656: stdout chunk (state=3): >>>ansible-tmp-1727204167.3590882-18297-67798689942181=/root/.ansible/tmp/ansible-tmp-1727204167.3590882-18297-67798689942181 <<< 15980 1727204167.39681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204167.39927: stderr chunk (state=3): >>><<< 15980 1727204167.40036: stdout chunk (state=3): >>><<< 15980 1727204167.40040: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204167.3590882-18297-67798689942181=/root/.ansible/tmp/ansible-tmp-1727204167.3590882-18297-67798689942181 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204167.40045: variable 'ansible_module_compression' from source: unknown 15980 1727204167.40272: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15980 1727204167.40278: variable 'ansible_facts' from source: unknown 15980 1727204167.40550: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204167.3590882-18297-67798689942181/AnsiballZ_package_facts.py 15980 1727204167.40739: Sending initial data 15980 1727204167.40778: Sent initial data (161 bytes) 15980 1727204167.41607: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204167.41612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204167.41635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204167.41735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204167.43369: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204167.43429: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204167.43529: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp5nyem0jg /root/.ansible/tmp/ansible-tmp-1727204167.3590882-18297-67798689942181/AnsiballZ_package_facts.py <<< 15980 1727204167.43539: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204167.3590882-18297-67798689942181/AnsiballZ_package_facts.py" <<< 15980 1727204167.43672: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp5nyem0jg" to remote "/root/.ansible/tmp/ansible-tmp-1727204167.3590882-18297-67798689942181/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204167.3590882-18297-67798689942181/AnsiballZ_package_facts.py" <<< 15980 1727204167.46764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204167.46778: stdout chunk (state=3): >>><<< 15980 1727204167.46793: stderr chunk (state=3): >>><<< 15980 1727204167.47075: done transferring module to remote 15980 1727204167.47079: _low_level_execute_command(): starting 15980 1727204167.47082: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204167.3590882-18297-67798689942181/ /root/.ansible/tmp/ansible-tmp-1727204167.3590882-18297-67798689942181/AnsiballZ_package_facts.py && sleep 0' 15980 1727204167.48289: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204167.48490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204167.48611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204167.48749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204167.50679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204167.50896: stderr chunk (state=3): >>><<< 15980 1727204167.50904: stdout chunk (state=3): >>><<< 15980 1727204167.50918: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204167.50928: _low_level_execute_command(): starting 15980 1727204167.50931: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204167.3590882-18297-67798689942181/AnsiballZ_package_facts.py && sleep 0' 15980 1727204167.52211: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204167.52488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204167.52515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204167.52769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204168.15106: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", 
"release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": 
"8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": 
"popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version":
"3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source":
"rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": 
"libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}],
"libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": 
"libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", 
"release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}],
"pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source":
"rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": 
"systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", 
"release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", 
"release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": 
"dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": 
"rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": 
"503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13",
"release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", 
"epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": 
"5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", 
"epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", 
"version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": 
"device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15980 1727204168.17173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
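The module result streamed above follows the `package_facts` schema: `ansible_facts.packages` maps each package name to a *list* of install records (a list because several versions or architectures of one name can coexist), each record carrying `name`, `version`, `release`, `epoch` (an integer or `null`), `arch`, and `source`. A minimal sketch of consuming that structure outside Ansible, using a few records copied from the log; the `nevra` helper is a hypothetical illustration, not part of Ansible:

```python
import json

# Three records taken verbatim from the package_facts output above.
sample = json.loads("""
{
  "tar":   [{"name": "tar",   "version": "1.35",   "release": "3.fc40", "epoch": 2,    "arch": "x86_64", "source": "rpm"}],
  "quota": [{"name": "quota", "version": "4.09",   "release": "5.fc40", "epoch": 1,    "arch": "x86_64", "source": "rpm"}],
  "git":   [{"name": "git",   "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]
}
""")

def nevra(rec):
    """Render one record as an RPM NEVRA string; a null epoch is omitted."""
    epoch = f"{rec['epoch']}:" if rec["epoch"] is not None else ""
    return f"{rec['name']}-{epoch}{rec['version']}-{rec['release']}.{rec['arch']}"

for name, records in sample.items():
    for rec in records:
        print(nevra(rec))
# → tar-2:1.35-3.fc40.x86_64
#   quota-1:4.09-5.fc40.x86_64
#   git-2.46.1-1.fc40.x86_64
```

Inside a playbook the same data is reached as `ansible_facts.packages['git'][0].version` after a `package_facts:` task.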
<<< 15980 1727204168.17335: stderr chunk (state=3): >>><<< 15980 1727204168.17339: stdout chunk (state=3): >>><<< 15980 1727204168.17583: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", 
"version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": 
[{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", 
"version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": 
"0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": 
"1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": 
"3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": 
"4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": 
"4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": 
"x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": 
"perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": 
"1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], 
"perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": 
"6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", 
"version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": 
"python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", 
"version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", 
"version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 15980 1727204168.21514: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204167.3590882-18297-67798689942181/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204168.21558: _low_level_execute_command(): starting 15980 1727204168.21573: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204167.3590882-18297-67798689942181/ > /dev/null 2>&1 && sleep 0' 15980 1727204168.22607: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204168.22646: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204168.22815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204168.24761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204168.24836: stderr chunk (state=3): >>><<< 15980 1727204168.24855: stdout chunk (state=3): >>><<< 15980 1727204168.24891: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204168.24898: handler run complete 15980 1727204168.25891: variable 'ansible_facts' from source: unknown 15980 1727204168.26779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204168.28590: variable 'ansible_facts' from source: unknown 15980 1727204168.29057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204168.30317: attempt loop complete, returning result 15980 1727204168.30322: _execute() done 15980 1727204168.30324: dumping result to json 15980 1727204168.31441: done dumping result, returning 15980 1727204168.31477: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-5f1d-4b72-0000000002fc] 15980 1727204168.31483: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000002fc 15980 1727204168.35216: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000002fc 15980 1727204168.35220: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15980 1727204168.35359: no more pending results, returning what we have 15980 1727204168.35362: results queue empty 15980 
1727204168.35363: checking for any_errors_fatal 15980 1727204168.35373: done checking for any_errors_fatal 15980 1727204168.35374: checking for max_fail_percentage 15980 1727204168.35376: done checking for max_fail_percentage 15980 1727204168.35377: checking to see if all hosts have failed and the running result is not ok 15980 1727204168.35380: done checking to see if all hosts have failed 15980 1727204168.35381: getting the remaining hosts for this loop 15980 1727204168.35382: done getting the remaining hosts for this loop 15980 1727204168.35385: getting the next task for host managed-node2 15980 1727204168.35392: done getting next task for host managed-node2 15980 1727204168.35396: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15980 1727204168.35398: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204168.35406: getting variables 15980 1727204168.35409: in VariableManager get_vars() 15980 1727204168.35444: Calling all_inventory to load vars for managed-node2 15980 1727204168.35446: Calling groups_inventory to load vars for managed-node2 15980 1727204168.35448: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204168.35456: Calling all_plugins_play to load vars for managed-node2 15980 1727204168.35458: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204168.35460: Calling groups_plugins_play to load vars for managed-node2 15980 1727204168.36988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204168.39303: done with get_vars() 15980 1727204168.39343: done getting variables 15980 1727204168.39416: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:56:08 -0400 (0:00:01.102) 0:00:29.804 ***** 15980 1727204168.39452: entering _queue_task() for managed-node2/debug 15980 1727204168.39848: worker is 1 (out of 1 available) 15980 1727204168.39863: exiting _queue_task() for managed-node2/debug 15980 1727204168.40078: done queuing things up, now waiting for results queue to drain 15980 1727204168.40080: waiting for pending results... 
15980 1727204168.40217: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 15980 1727204168.40419: in run() - task 127b8e07-fff9-5f1d-4b72-00000000003b 15980 1727204168.40423: variable 'ansible_search_path' from source: unknown 15980 1727204168.40425: variable 'ansible_search_path' from source: unknown 15980 1727204168.40429: calling self._execute() 15980 1727204168.40529: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204168.40544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204168.40636: variable 'omit' from source: magic vars 15980 1727204168.41093: variable 'ansible_distribution_major_version' from source: facts 15980 1727204168.41116: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204168.41129: variable 'omit' from source: magic vars 15980 1727204168.41190: variable 'omit' from source: magic vars 15980 1727204168.41324: variable 'network_provider' from source: set_fact 15980 1727204168.41351: variable 'omit' from source: magic vars 15980 1727204168.41402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204168.41446: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204168.41475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204168.41505: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204168.41527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204168.41562: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204168.41574: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 
1727204168.41582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204168.41695: Set connection var ansible_connection to ssh 15980 1727204168.41709: Set connection var ansible_pipelining to False 15980 1727204168.41719: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204168.41838: Set connection var ansible_timeout to 10 15980 1727204168.41841: Set connection var ansible_shell_type to sh 15980 1727204168.41844: Set connection var ansible_shell_executable to /bin/sh 15980 1727204168.41846: variable 'ansible_shell_executable' from source: unknown 15980 1727204168.41849: variable 'ansible_connection' from source: unknown 15980 1727204168.41851: variable 'ansible_module_compression' from source: unknown 15980 1727204168.41854: variable 'ansible_shell_type' from source: unknown 15980 1727204168.41857: variable 'ansible_shell_executable' from source: unknown 15980 1727204168.41859: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204168.41861: variable 'ansible_pipelining' from source: unknown 15980 1727204168.41863: variable 'ansible_timeout' from source: unknown 15980 1727204168.41868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204168.41993: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204168.42011: variable 'omit' from source: magic vars 15980 1727204168.42021: starting attempt loop 15980 1727204168.42028: running the handler 15980 1727204168.42087: handler run complete 15980 1727204168.42109: attempt loop complete, returning result 15980 1727204168.42116: _execute() done 15980 1727204168.42123: dumping result to json 15980 1727204168.42130: done dumping result, returning 
15980 1727204168.42144: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-5f1d-4b72-00000000003b] 15980 1727204168.42153: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000003b ok: [managed-node2] => {} MSG: Using network provider: nm 15980 1727204168.42341: no more pending results, returning what we have 15980 1727204168.42345: results queue empty 15980 1727204168.42346: checking for any_errors_fatal 15980 1727204168.42358: done checking for any_errors_fatal 15980 1727204168.42359: checking for max_fail_percentage 15980 1727204168.42360: done checking for max_fail_percentage 15980 1727204168.42361: checking to see if all hosts have failed and the running result is not ok 15980 1727204168.42362: done checking to see if all hosts have failed 15980 1727204168.42363: getting the remaining hosts for this loop 15980 1727204168.42367: done getting the remaining hosts for this loop 15980 1727204168.42372: getting the next task for host managed-node2 15980 1727204168.42379: done getting next task for host managed-node2 15980 1727204168.42384: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15980 1727204168.42386: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204168.42397: getting variables 15980 1727204168.42399: in VariableManager get_vars() 15980 1727204168.42443: Calling all_inventory to load vars for managed-node2 15980 1727204168.42446: Calling groups_inventory to load vars for managed-node2 15980 1727204168.42448: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204168.42461: Calling all_plugins_play to load vars for managed-node2 15980 1727204168.42672: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204168.42681: Calling groups_plugins_play to load vars for managed-node2 15980 1727204168.43390: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000003b 15980 1727204168.43394: WORKER PROCESS EXITING 15980 1727204168.45819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204168.49895: done with get_vars() 15980 1727204168.49928: done getting variables 15980 1727204168.50000: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:56:08 -0400 (0:00:00.105) 0:00:29.910 ***** 15980 1727204168.50036: entering _queue_task() for managed-node2/fail 15980 1727204168.50432: worker is 1 (out of 1 available) 15980 1727204168.50445: exiting _queue_task() for managed-node2/fail 15980 1727204168.50458: done queuing things up, now waiting for results queue to drain 15980 1727204168.50459: waiting for pending results... 
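The "Print network provider" result above (`ok: [managed-node2] => {} MSG: Using network provider: nm`) is a templated debug message; the actual rendering is done by Jinja2 inside Ansible's debug action plugin. A minimal sketch of that rendering, assuming the variable name `network_provider` carries the value the log reports:

```python
# Illustrative only: reproduces the MSG string seen in the log above.
# The real rendering happens in Ansible's debug action via Jinja2 templating.

network_provider = "nm"  # value reported by the log for managed-node2
msg = f"Using network provider: {network_provider}"
print(msg)
```

This is why the task result body is empty (`=> {}`) while the message line still appears: `debug` emits the rendered string separately as MSG.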
15980 1727204168.50778: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15980 1727204168.50887: in run() - task 127b8e07-fff9-5f1d-4b72-00000000003c 15980 1727204168.50901: variable 'ansible_search_path' from source: unknown 15980 1727204168.50905: variable 'ansible_search_path' from source: unknown 15980 1727204168.50946: calling self._execute() 15980 1727204168.51049: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204168.51055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204168.51068: variable 'omit' from source: magic vars 15980 1727204168.51494: variable 'ansible_distribution_major_version' from source: facts 15980 1727204168.51506: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204168.51633: variable 'network_state' from source: role '' defaults 15980 1727204168.51641: Evaluated conditional (network_state != {}): False 15980 1727204168.51645: when evaluation is False, skipping this task 15980 1727204168.51649: _execute() done 15980 1727204168.51652: dumping result to json 15980 1727204168.51654: done dumping result, returning 15980 1727204168.51662: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-5f1d-4b72-00000000003c] 15980 1727204168.51670: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000003c 15980 1727204168.51779: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000003c 15980 1727204168.51782: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15980 1727204168.51846: no more pending results, 
returning what we have 15980 1727204168.51850: results queue empty 15980 1727204168.51851: checking for any_errors_fatal 15980 1727204168.51862: done checking for any_errors_fatal 15980 1727204168.51862: checking for max_fail_percentage 15980 1727204168.51864: done checking for max_fail_percentage 15980 1727204168.51870: checking to see if all hosts have failed and the running result is not ok 15980 1727204168.51871: done checking to see if all hosts have failed 15980 1727204168.51872: getting the remaining hosts for this loop 15980 1727204168.51874: done getting the remaining hosts for this loop 15980 1727204168.51878: getting the next task for host managed-node2 15980 1727204168.51883: done getting next task for host managed-node2 15980 1727204168.51888: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15980 1727204168.51890: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204168.51907: getting variables 15980 1727204168.51908: in VariableManager get_vars() 15980 1727204168.51962: Calling all_inventory to load vars for managed-node2 15980 1727204168.52163: Calling groups_inventory to load vars for managed-node2 15980 1727204168.52194: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204168.52216: Calling all_plugins_play to load vars for managed-node2 15980 1727204168.52220: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204168.52229: Calling groups_plugins_play to load vars for managed-node2 15980 1727204168.55781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204168.58835: done with get_vars() 15980 1727204168.58876: done getting variables 15980 1727204168.58942: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:56:08 -0400 (0:00:00.089) 0:00:30.000 ***** 15980 1727204168.58975: entering _queue_task() for managed-node2/fail 15980 1727204168.59555: worker is 1 (out of 1 available) 15980 1727204168.59599: exiting _queue_task() for managed-node2/fail 15980 1727204168.59614: done queuing things up, now waiting for results queue to drain 15980 1727204168.59616: waiting for pending results... 
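The two skips recorded above both come from the same `when:` guard: the role's default leaves `network_state` as an empty mapping, so the conditional `network_state != {}` evaluates False and the fail task never runs. A minimal Python sketch of that gating logic (illustrative only; inside Ansible the evaluation is performed by Jinja2, and the variable name mirrors the log):

```python
# Sketch of the conditional gating seen in the log, not Ansible's own code.
# With the role default (empty dict), "network_state != {}" is False,
# so the guarded abort task is skipped.

def evaluate_when(network_state: dict) -> str:
    """Return 'run' or 'skip' for a task guarded by `when: network_state != {}`."""
    if network_state != {}:
        return "run"
    return "skip"

print(evaluate_when({}))                  # role default: task is skipped
print(evaluate_when({"interfaces": []}))  # user-supplied state: task would run
```

This matches the `"false_condition": "network_state != {}"` field in the skip result: Ansible records which clause evaluated False.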
15980 1727204168.59992: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15980 1727204168.60030: in run() - task 127b8e07-fff9-5f1d-4b72-00000000003d 15980 1727204168.60035: variable 'ansible_search_path' from source: unknown 15980 1727204168.60038: variable 'ansible_search_path' from source: unknown 15980 1727204168.60111: calling self._execute() 15980 1727204168.60169: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204168.60174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204168.60183: variable 'omit' from source: magic vars 15980 1727204168.60639: variable 'ansible_distribution_major_version' from source: facts 15980 1727204168.60655: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204168.60802: variable 'network_state' from source: role '' defaults 15980 1727204168.60807: Evaluated conditional (network_state != {}): False 15980 1727204168.60811: when evaluation is False, skipping this task 15980 1727204168.60818: _execute() done 15980 1727204168.60821: dumping result to json 15980 1727204168.60824: done dumping result, returning 15980 1727204168.60831: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-5f1d-4b72-00000000003d] 15980 1727204168.60834: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000003d 15980 1727204168.61016: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000003d 15980 1727204168.61019: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15980 1727204168.61081: no more pending results, returning what we have 15980 
1727204168.61085: results queue empty 15980 1727204168.61086: checking for any_errors_fatal 15980 1727204168.61092: done checking for any_errors_fatal 15980 1727204168.61093: checking for max_fail_percentage 15980 1727204168.61095: done checking for max_fail_percentage 15980 1727204168.61096: checking to see if all hosts have failed and the running result is not ok 15980 1727204168.61096: done checking to see if all hosts have failed 15980 1727204168.61097: getting the remaining hosts for this loop 15980 1727204168.61098: done getting the remaining hosts for this loop 15980 1727204168.61102: getting the next task for host managed-node2 15980 1727204168.61108: done getting next task for host managed-node2 15980 1727204168.61112: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15980 1727204168.61114: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204168.61133: getting variables 15980 1727204168.61134: in VariableManager get_vars() 15980 1727204168.61181: Calling all_inventory to load vars for managed-node2 15980 1727204168.61184: Calling groups_inventory to load vars for managed-node2 15980 1727204168.61186: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204168.61196: Calling all_plugins_play to load vars for managed-node2 15980 1727204168.61199: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204168.61202: Calling groups_plugins_play to load vars for managed-node2 15980 1727204168.64995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204168.68757: done with get_vars() 15980 1727204168.68822: done getting variables 15980 1727204168.68954: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:56:08 -0400 (0:00:00.100) 0:00:30.100 ***** 15980 1727204168.68997: entering _queue_task() for managed-node2/fail 15980 1727204168.69556: worker is 1 (out of 1 available) 15980 1727204168.69613: exiting _queue_task() for managed-node2/fail 15980 1727204168.69649: done queuing things up, now waiting for results queue to drain 15980 1727204168.69652: waiting for pending results... 
15980 1727204168.70028: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15980 1727204168.70033: in run() - task 127b8e07-fff9-5f1d-4b72-00000000003e 15980 1727204168.70038: variable 'ansible_search_path' from source: unknown 15980 1727204168.70041: variable 'ansible_search_path' from source: unknown 15980 1727204168.70044: calling self._execute() 15980 1727204168.70048: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204168.70052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204168.70055: variable 'omit' from source: magic vars 15980 1727204168.70544: variable 'ansible_distribution_major_version' from source: facts 15980 1727204168.70548: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204168.70734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204168.73459: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204168.73507: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204168.73587: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204168.73591: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204168.73879: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204168.73971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204168.74136: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204168.74158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204168.74317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204168.74346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204168.74727: variable 'ansible_distribution_major_version' from source: facts 15980 1727204168.74735: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15980 1727204168.75067: variable 'ansible_distribution' from source: facts 15980 1727204168.75071: variable '__network_rh_distros' from source: role '' defaults 15980 1727204168.75084: Evaluated conditional (ansible_distribution in __network_rh_distros): False 15980 1727204168.75092: when evaluation is False, skipping this task 15980 1727204168.75095: _execute() done 15980 1727204168.75098: dumping result to json 15980 1727204168.75100: done dumping result, returning 15980 1727204168.75113: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-5f1d-4b72-00000000003e] 15980 1727204168.75116: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000003e skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": 
"Conditional result was False" } 15980 1727204168.75462: no more pending results, returning what we have 15980 1727204168.75469: results queue empty 15980 1727204168.75470: checking for any_errors_fatal 15980 1727204168.75479: done checking for any_errors_fatal 15980 1727204168.75480: checking for max_fail_percentage 15980 1727204168.75481: done checking for max_fail_percentage 15980 1727204168.75482: checking to see if all hosts have failed and the running result is not ok 15980 1727204168.75483: done checking to see if all hosts have failed 15980 1727204168.75484: getting the remaining hosts for this loop 15980 1727204168.75486: done getting the remaining hosts for this loop 15980 1727204168.75491: getting the next task for host managed-node2 15980 1727204168.75500: done getting next task for host managed-node2 15980 1727204168.75504: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15980 1727204168.75507: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204168.75522: getting variables 15980 1727204168.75529: in VariableManager get_vars() 15980 1727204168.75791: Calling all_inventory to load vars for managed-node2 15980 1727204168.75795: Calling groups_inventory to load vars for managed-node2 15980 1727204168.75798: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204168.75808: Calling all_plugins_play to load vars for managed-node2 15980 1727204168.75811: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204168.75820: Calling groups_plugins_play to load vars for managed-node2 15980 1727204168.76479: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000003e 15980 1727204168.76484: WORKER PROCESS EXITING 15980 1727204168.79756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204168.82412: done with get_vars() 15980 1727204168.82458: done getting variables 15980 1727204168.82538: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:56:08 -0400 (0:00:00.135) 0:00:30.236 ***** 15980 1727204168.82584: entering _queue_task() for managed-node2/dnf 15980 1727204168.83118: worker is 1 (out of 1 available) 15980 1727204168.83133: exiting _queue_task() for managed-node2/dnf 15980 1727204168.83147: done queuing things up, now waiting for results queue to drain 15980 1727204168.83150: waiting for pending results... 
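The teaming abort task above shows a two-stage guard: first `ansible_distribution_major_version | int > 9` evaluates True, then `ansible_distribution in __network_rh_distros` evaluates False, so the task is skipped. A sketch of that short-circuit order, under the assumption that the guards combine as the log sequence suggests (function and parameter names are hypothetical; the contents of `__network_rh_distros` are not shown in the log):

```python
# Illustrative sketch of the two-stage `when:` guard on the teaming abort task.
# Stage 1: EL10-or-later check via a Jinja2-style int cast ("| int > 9").
# Stage 2: membership in the role's Red Hat distro list.

def should_abort_teaming(major_version: str, distribution: str,
                         rh_distros: list[str]) -> bool:
    if not int(major_version) > 9:
        return False                 # old enough: teaming still supported
    return distribution in rh_distros  # only abort on RH-family distros

print(should_abort_teaming("10", "SomeDistro", ["RedHat", "CentOS"]))  # skipped, as in the log
```

The skip result records only the clause that failed (`"false_condition": "ansible_distribution in __network_rh_distros"`), which is why the earlier True clause does not appear there.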
15980 1727204168.83689: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15980 1727204168.83873: in run() - task 127b8e07-fff9-5f1d-4b72-00000000003f 15980 1727204168.83893: variable 'ansible_search_path' from source: unknown 15980 1727204168.83897: variable 'ansible_search_path' from source: unknown 15980 1727204168.83900: calling self._execute() 15980 1727204168.83903: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204168.83906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204168.83909: variable 'omit' from source: magic vars 15980 1727204168.84188: variable 'ansible_distribution_major_version' from source: facts 15980 1727204168.84202: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204168.84474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204168.88355: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204168.88432: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204168.88471: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204168.88510: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204168.88548: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204168.88652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204168.88684: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204168.88710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204168.88873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204168.88878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204168.88936: variable 'ansible_distribution' from source: facts 15980 1727204168.88945: variable 'ansible_distribution_major_version' from source: facts 15980 1727204168.88954: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15980 1727204168.89101: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204168.89328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204168.89332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204168.89376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204168.89491: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204168.89843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204168.89847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204168.89849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204168.89852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204168.90262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204168.90297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204168.90373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204168.90410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 
1727204168.90456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204168.90505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204168.90532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204168.90871: variable 'network_connections' from source: play vars 15980 1727204168.90874: variable 'profile' from source: play vars 15980 1727204168.90877: variable 'profile' from source: play vars 15980 1727204168.90880: variable 'interface' from source: set_fact 15980 1727204168.90916: variable 'interface' from source: set_fact 15980 1727204168.90993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204168.91271: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204168.91274: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204168.91277: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204168.91299: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204168.91352: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204168.91376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204168.91402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204168.91428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204168.91771: variable '__network_team_connections_defined' from source: role '' defaults 15980 1727204168.91783: variable 'network_connections' from source: play vars 15980 1727204168.91790: variable 'profile' from source: play vars 15980 1727204168.91856: variable 'profile' from source: play vars 15980 1727204168.91859: variable 'interface' from source: set_fact 15980 1727204168.91921: variable 'interface' from source: set_fact 15980 1727204168.91951: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15980 1727204168.91954: when evaluation is False, skipping this task 15980 1727204168.91957: _execute() done 15980 1727204168.91959: dumping result to json 15980 1727204168.91963: done dumping result, returning 15980 1727204168.91975: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-5f1d-4b72-00000000003f] 15980 1727204168.91984: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000003f skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15980 1727204168.92142: no more pending results, returning what we have 15980 1727204168.92145: results queue 
empty 15980 1727204168.92147: checking for any_errors_fatal 15980 1727204168.92153: done checking for any_errors_fatal 15980 1727204168.92154: checking for max_fail_percentage 15980 1727204168.92156: done checking for max_fail_percentage 15980 1727204168.92157: checking to see if all hosts have failed and the running result is not ok 15980 1727204168.92158: done checking to see if all hosts have failed 15980 1727204168.92159: getting the remaining hosts for this loop 15980 1727204168.92160: done getting the remaining hosts for this loop 15980 1727204168.92167: getting the next task for host managed-node2 15980 1727204168.92175: done getting next task for host managed-node2 15980 1727204168.92179: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15980 1727204168.92182: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204168.92200: getting variables 15980 1727204168.92201: in VariableManager get_vars() 15980 1727204168.92246: Calling all_inventory to load vars for managed-node2 15980 1727204168.92249: Calling groups_inventory to load vars for managed-node2 15980 1727204168.92251: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204168.92264: Calling all_plugins_play to load vars for managed-node2 15980 1727204168.92472: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204168.92478: Calling groups_plugins_play to load vars for managed-node2 15980 1727204168.93189: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000003f 15980 1727204168.93193: WORKER PROCESS EXITING 15980 1727204168.95526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204168.99827: done with get_vars() 15980 1727204169.00173: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15980 1727204169.00263: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:56:09 -0400 (0:00:00.177) 0:00:30.413 ***** 15980 1727204169.00300: entering _queue_task() for managed-node2/yum 15980 1727204169.01093: worker is 1 (out of 1 available) 15980 1727204169.01110: exiting _queue_task() for managed-node2/yum 15980 1727204169.01122: done queuing things up, now 
waiting for results queue to drain 15980 1727204169.01125: waiting for pending results... 15980 1727204169.01652: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15980 1727204169.01977: in run() - task 127b8e07-fff9-5f1d-4b72-000000000040 15980 1727204169.01982: variable 'ansible_search_path' from source: unknown 15980 1727204169.01985: variable 'ansible_search_path' from source: unknown 15980 1727204169.02101: calling self._execute() 15980 1727204169.02422: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204169.02442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204169.02457: variable 'omit' from source: magic vars 15980 1727204169.03259: variable 'ansible_distribution_major_version' from source: facts 15980 1727204169.03383: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204169.03855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204169.09928: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204169.10010: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204169.10058: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204169.10102: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204169.10136: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204169.10233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204169.10272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204169.10302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204169.10354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204169.10375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204169.10489: variable 'ansible_distribution_major_version' from source: facts 15980 1727204169.10510: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15980 1727204169.10517: when evaluation is False, skipping this task 15980 1727204169.10524: _execute() done 15980 1727204169.10534: dumping result to json 15980 1727204169.10542: done dumping result, returning 15980 1727204169.10553: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-5f1d-4b72-000000000040] 15980 1727204169.10563: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000040 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15980 1727204169.10730: no more pending results, returning 
what we have 15980 1727204169.10733: results queue empty 15980 1727204169.10735: checking for any_errors_fatal 15980 1727204169.10742: done checking for any_errors_fatal 15980 1727204169.10743: checking for max_fail_percentage 15980 1727204169.10745: done checking for max_fail_percentage 15980 1727204169.10745: checking to see if all hosts have failed and the running result is not ok 15980 1727204169.10746: done checking to see if all hosts have failed 15980 1727204169.10747: getting the remaining hosts for this loop 15980 1727204169.10749: done getting the remaining hosts for this loop 15980 1727204169.10753: getting the next task for host managed-node2 15980 1727204169.10759: done getting next task for host managed-node2 15980 1727204169.10763: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15980 1727204169.10767: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204169.10787: getting variables 15980 1727204169.10789: in VariableManager get_vars() 15980 1727204169.10830: Calling all_inventory to load vars for managed-node2 15980 1727204169.10833: Calling groups_inventory to load vars for managed-node2 15980 1727204169.10835: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204169.10847: Calling all_plugins_play to load vars for managed-node2 15980 1727204169.10850: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204169.10854: Calling groups_plugins_play to load vars for managed-node2 15980 1727204169.11476: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000040 15980 1727204169.11481: WORKER PROCESS EXITING 15980 1727204169.14713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204169.18688: done with get_vars() 15980 1727204169.18735: done getting variables 15980 1727204169.18804: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:56:09 -0400 (0:00:00.185) 0:00:30.598 ***** 15980 1727204169.18848: entering _queue_task() for managed-node2/fail 15980 1727204169.19258: worker is 1 (out of 1 available) 15980 1727204169.19273: exiting _queue_task() for managed-node2/fail 15980 1727204169.19287: done queuing things up, now waiting for results queue to drain 15980 1727204169.19289: waiting for pending results... 
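The skip recorded above comes from two layered `when:` gates: the role-wide guard `ansible_distribution_major_version != '6'` (True here) and the task's own `ansible_distribution_major_version | int < 8` (False, so the yum check is skipped — on EL8+/Fedora the `yum` action is redirected to `dnf` anyway, as the log shows). A minimal pure-Python sketch of that decision logic (the real evaluation is done by Ansible through Jinja2; the major version "40" below is illustrative, not taken from this log):

```python
def yum_check_needed(facts: dict) -> bool:
    """Mirror the two conditionals the log shows for the task at
    roles/network/tasks/main.yml:48 (illustrative sketch, not Ansible code)."""
    major = facts["ansible_distribution_major_version"]
    # Role-wide gate: the role refuses to run on EL6 at all.
    if major == "6":
        return False
    # Task gate, the "false_condition" in the skip result:
    #   ansible_distribution_major_version | int < 8
    # Jinja2's `int` filter makes this a numeric, not string, comparison.
    return int(major) < 8

print(yum_check_needed({"ansible_distribution_major_version": "40"}))  # False -> skipped
print(yum_check_needed({"ansible_distribution_major_version": "7"}))   # True
```

Note the `| int` cast matters: the fact is a string, and a lexicographic `'40' < '8'` would be True, wrongly triggering the yum path on modern systems.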
15980 1727204169.19629: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15980 1727204169.19973: in run() - task 127b8e07-fff9-5f1d-4b72-000000000041 15980 1727204169.19978: variable 'ansible_search_path' from source: unknown 15980 1727204169.19981: variable 'ansible_search_path' from source: unknown 15980 1727204169.19984: calling self._execute() 15980 1727204169.19986: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204169.19989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204169.19991: variable 'omit' from source: magic vars 15980 1727204169.20572: variable 'ansible_distribution_major_version' from source: facts 15980 1727204169.20577: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204169.20580: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204169.20748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204169.24243: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204169.24430: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204169.24474: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204169.24506: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204169.24759: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204169.24966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15980 1727204169.25134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204169.25162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204169.25252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204169.25671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204169.25675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204169.25678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204169.25681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204169.25683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204169.25686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204169.25689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204169.25692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204169.25695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204169.25742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204169.25755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204169.25958: variable 'network_connections' from source: play vars 15980 1727204169.25973: variable 'profile' from source: play vars 15980 1727204169.26054: variable 'profile' from source: play vars 15980 1727204169.26058: variable 'interface' from source: set_fact 15980 1727204169.26124: variable 'interface' from source: set_fact 15980 1727204169.26211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204169.26406: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204169.26447: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204169.26485: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204169.26514: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204169.26562: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204169.26592: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204169.26618: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204169.26644: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204169.26700: variable '__network_team_connections_defined' from source: role '' defaults 15980 1727204169.26978: variable 'network_connections' from source: play vars 15980 1727204169.26984: variable 'profile' from source: play vars 15980 1727204169.27054: variable 'profile' from source: play vars 15980 1727204169.27058: variable 'interface' from source: set_fact 15980 1727204169.27128: variable 'interface' from source: set_fact 15980 1727204169.27154: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15980 1727204169.27158: when evaluation is False, skipping this task 15980 1727204169.27160: _execute() done 15980 1727204169.27163: dumping result to json 15980 1727204169.27167: done dumping result, returning 15980 1727204169.27176: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-5f1d-4b72-000000000041] 15980 1727204169.27188: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000041 15980 1727204169.27291: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000041 15980 1727204169.27295: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15980 1727204169.27349: no more pending results, returning what we have 15980 1727204169.27353: results queue empty 15980 1727204169.27354: checking for any_errors_fatal 15980 1727204169.27361: done checking for any_errors_fatal 15980 1727204169.27361: checking for max_fail_percentage 15980 1727204169.27363: done checking for max_fail_percentage 15980 1727204169.27364: checking to see if all hosts have failed and the running result is not ok 15980 1727204169.27367: done checking to see if all hosts have failed 15980 1727204169.27368: getting the remaining hosts for this loop 15980 1727204169.27369: done getting the remaining hosts for this loop 15980 1727204169.27374: getting the next task for host managed-node2 15980 1727204169.27381: done getting next task for host managed-node2 15980 1727204169.27386: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15980 1727204169.27388: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204169.27405: getting variables 15980 1727204169.27407: in VariableManager get_vars() 15980 1727204169.27448: Calling all_inventory to load vars for managed-node2 15980 1727204169.27452: Calling groups_inventory to load vars for managed-node2 15980 1727204169.27455: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204169.27469: Calling all_plugins_play to load vars for managed-node2 15980 1727204169.27472: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204169.27476: Calling groups_plugins_play to load vars for managed-node2 15980 1727204169.29433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204169.31725: done with get_vars() 15980 1727204169.31761: done getting variables 15980 1727204169.31828: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:56:09 -0400 (0:00:00.130) 0:00:30.728 ***** 15980 1727204169.31861: entering _queue_task() for managed-node2/package 15980 1727204169.32239: worker is 1 (out of 1 available) 15980 1727204169.32254: exiting _queue_task() for managed-node2/package 15980 1727204169.32372: done queuing things up, now waiting for results queue to drain 15980 1727204169.32375: waiting for pending results... 
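The consent task above is skipped because neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` is true. Those two variables come from the role's defaults, where (per the log's variable traces) they are computed over `network_connections` from play vars. A hedged Python sketch of the effective check — the exact Jinja2 expressions live in the role's defaults file and are not shown in this log, and the example connection below is hypothetical:

```python
def restart_consent_required(network_connections: list) -> bool:
    """Illustrative stand-in for
    __network_wireless_connections_defined or __network_team_connections_defined:
    restarting NetworkManager needs user consent only when a wireless or
    team connection profile is being managed."""
    wireless = any(c.get("type") == "wireless" for c in network_connections)
    team = any(c.get("type") == "team" for c in network_connections)
    return wireless or team

# Hypothetical profile shaped like this test run's single ethernet interface:
conns = [{"name": "testnic", "type": "ethernet", "interface_name": "testnic"}]
print(restart_consent_required(conns))  # False -> fail task skipped
```

Since this run manages only an ethernet-style profile, the `fail` action is never executed and the play continues without prompting.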
15980 1727204169.32986: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 15980 1727204169.32993: in run() - task 127b8e07-fff9-5f1d-4b72-000000000042 15980 1727204169.32997: variable 'ansible_search_path' from source: unknown 15980 1727204169.33001: variable 'ansible_search_path' from source: unknown 15980 1727204169.33009: calling self._execute() 15980 1727204169.33220: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204169.33224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204169.33238: variable 'omit' from source: magic vars 15980 1727204169.34209: variable 'ansible_distribution_major_version' from source: facts 15980 1727204169.34214: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204169.34796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204169.35592: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204169.35662: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204169.35703: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204169.35913: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204169.36117: variable 'network_packages' from source: role '' defaults 15980 1727204169.36391: variable '__network_provider_setup' from source: role '' defaults 15980 1727204169.36408: variable '__network_service_name_default_nm' from source: role '' defaults 15980 1727204169.36528: variable '__network_service_name_default_nm' from source: role '' defaults 15980 1727204169.36655: variable '__network_packages_default_nm' from source: role '' defaults 15980 1727204169.36721: variable 
'__network_packages_default_nm' from source: role '' defaults 15980 1727204169.37270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204169.40273: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204169.40344: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204169.40384: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204169.40418: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204169.40449: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204169.40670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204169.40674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204169.40677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204169.40679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204169.40681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 
1727204169.40700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204169.40724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204169.40754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204169.40800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204169.40815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204169.41074: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15980 1727204169.41472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204169.41477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204169.41479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204169.41482: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204169.41484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204169.41487: variable 'ansible_python' from source: facts 15980 1727204169.41489: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15980 1727204169.41551: variable '__network_wpa_supplicant_required' from source: role '' defaults 15980 1727204169.41641: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15980 1727204169.41804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204169.41831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204169.41861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204169.41907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204169.41919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204169.41970: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204169.41994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204169.42020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204169.42059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204169.42075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204169.42228: variable 'network_connections' from source: play vars 15980 1727204169.42232: variable 'profile' from source: play vars 15980 1727204169.42339: variable 'profile' from source: play vars 15980 1727204169.42505: variable 'interface' from source: set_fact 15980 1727204169.42508: variable 'interface' from source: set_fact 15980 1727204169.42610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204169.42771: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204169.42774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204169.42776: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204169.42779: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204169.43609: variable 'network_connections' from source: play vars 15980 1727204169.43613: variable 'profile' from source: play vars 15980 1727204169.43839: variable 'profile' from source: play vars 15980 1727204169.43845: variable 'interface' from source: set_fact 15980 1727204169.44038: variable 'interface' from source: set_fact 15980 1727204169.44194: variable '__network_packages_default_wireless' from source: role '' defaults 15980 1727204169.44280: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204169.45373: variable 'network_connections' from source: play vars 15980 1727204169.45376: variable 'profile' from source: play vars 15980 1727204169.45378: variable 'profile' from source: play vars 15980 1727204169.45380: variable 'interface' from source: set_fact 15980 1727204169.45537: variable 'interface' from source: set_fact 15980 1727204169.45571: variable '__network_packages_default_team' from source: role '' defaults 15980 1727204169.45649: variable '__network_team_connections_defined' from source: role '' defaults 15980 1727204169.45980: variable 'network_connections' from source: play vars 15980 1727204169.45983: variable 'profile' from source: play vars 15980 1727204169.46055: variable 'profile' from source: play vars 15980 1727204169.46059: variable 'interface' from source: set_fact 15980 1727204169.46278: variable 'interface' from source: set_fact 15980 1727204169.46339: variable '__network_service_name_default_initscripts' from source: role '' defaults 15980 1727204169.46619: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 15980 1727204169.46629: variable '__network_packages_default_initscripts' from source: role '' defaults 15980 1727204169.46797: variable '__network_packages_default_initscripts' from source: role '' defaults 15980 1727204169.47388: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15980 1727204169.48595: variable 'network_connections' from source: play vars 15980 1727204169.48599: variable 'profile' from source: play vars 15980 1727204169.48788: variable 'profile' from source: play vars 15980 1727204169.48792: variable 'interface' from source: set_fact 15980 1727204169.48861: variable 'interface' from source: set_fact 15980 1727204169.48980: variable 'ansible_distribution' from source: facts 15980 1727204169.48984: variable '__network_rh_distros' from source: role '' defaults 15980 1727204169.49044: variable 'ansible_distribution_major_version' from source: facts 15980 1727204169.49047: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15980 1727204169.49420: variable 'ansible_distribution' from source: facts 15980 1727204169.49424: variable '__network_rh_distros' from source: role '' defaults 15980 1727204169.49429: variable 'ansible_distribution_major_version' from source: facts 15980 1727204169.49436: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15980 1727204169.49798: variable 'ansible_distribution' from source: facts 15980 1727204169.49802: variable '__network_rh_distros' from source: role '' defaults 15980 1727204169.49805: variable 'ansible_distribution_major_version' from source: facts 15980 1727204169.49955: variable 'network_provider' from source: set_fact 15980 1727204169.50017: variable 'ansible_facts' from source: unknown 15980 1727204169.51922: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15980 
1727204169.51929: when evaluation is False, skipping this task 15980 1727204169.51931: _execute() done 15980 1727204169.51934: dumping result to json 15980 1727204169.51936: done dumping result, returning 15980 1727204169.51943: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-5f1d-4b72-000000000042] 15980 1727204169.51949: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000042 skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15980 1727204169.52225: no more pending results, returning what we have 15980 1727204169.52230: results queue empty 15980 1727204169.52231: checking for any_errors_fatal 15980 1727204169.52241: done checking for any_errors_fatal 15980 1727204169.52242: checking for max_fail_percentage 15980 1727204169.52244: done checking for max_fail_percentage 15980 1727204169.52245: checking to see if all hosts have failed and the running result is not ok 15980 1727204169.52246: done checking to see if all hosts have failed 15980 1727204169.52247: getting the remaining hosts for this loop 15980 1727204169.52249: done getting the remaining hosts for this loop 15980 1727204169.52253: getting the next task for host managed-node2 15980 1727204169.52261: done getting next task for host managed-node2 15980 1727204169.52268: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15980 1727204169.52272: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204169.52294: getting variables 15980 1727204169.52296: in VariableManager get_vars() 15980 1727204169.52342: Calling all_inventory to load vars for managed-node2 15980 1727204169.52345: Calling groups_inventory to load vars for managed-node2 15980 1727204169.52348: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204169.52362: Calling all_plugins_play to load vars for managed-node2 15980 1727204169.52576: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204169.52581: Calling groups_plugins_play to load vars for managed-node2 15980 1727204169.53135: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000042 15980 1727204169.53141: WORKER PROCESS EXITING 15980 1727204169.56915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204169.61622: done with get_vars() 15980 1727204169.61788: done getting variables 15980 1727204169.61863: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:56:09 -0400 (0:00:00.301) 0:00:31.030 ***** 15980 1727204169.62008: entering _queue_task() for managed-node2/package 15980 1727204169.62764: worker is 1 (out of 1 available) 15980 1727204169.62897: exiting _queue_task() for managed-node2/package 15980 1727204169.62913: done queuing things up, now waiting for results queue to drain 15980 1727204169.62916: waiting for pending results... 
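The "Install packages" skip above trips the conditional `not network_packages is subset(ansible_facts.packages.keys())`. Its semantics can be sketched with plain Python set operations; the package names below are placeholders for illustration, not values from this run:

```python
# Minimal sketch of what Ansible's `subset` test checks here, using Python
# sets. `network_packages` and `installed` are made-up placeholders.
network_packages = ["NetworkManager"]
installed = {"NetworkManager": "1.48", "openssh-server": "9.6"}

# Equivalent of: not network_packages is subset(ansible_facts.packages.keys())
# When every requested package is already installed, this is False and the
# task is skipped, matching the "Conditional result was False" record above.
needs_install = not set(network_packages).issubset(installed.keys())
print(needs_install)
```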
15980 1727204169.63373: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15980 1727204169.63471: in run() - task 127b8e07-fff9-5f1d-4b72-000000000043 15980 1727204169.63600: variable 'ansible_search_path' from source: unknown 15980 1727204169.63605: variable 'ansible_search_path' from source: unknown 15980 1727204169.63655: calling self._execute() 15980 1727204169.63762: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204169.63770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204169.63773: variable 'omit' from source: magic vars 15980 1727204169.64727: variable 'ansible_distribution_major_version' from source: facts 15980 1727204169.64857: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204169.65120: variable 'network_state' from source: role '' defaults 15980 1727204169.65133: Evaluated conditional (network_state != {}): False 15980 1727204169.65136: when evaluation is False, skipping this task 15980 1727204169.65140: _execute() done 15980 1727204169.65142: dumping result to json 15980 1727204169.65145: done dumping result, returning 15980 1727204169.65155: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-5f1d-4b72-000000000043] 15980 1727204169.65161: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000043 15980 1727204169.65278: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000043 15980 1727204169.65282: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15980 1727204169.65341: no more pending results, returning what we have 15980 1727204169.65346: results queue empty 15980 1727204169.65348: checking 
for any_errors_fatal 15980 1727204169.65358: done checking for any_errors_fatal 15980 1727204169.65359: checking for max_fail_percentage 15980 1727204169.65361: done checking for max_fail_percentage 15980 1727204169.65362: checking to see if all hosts have failed and the running result is not ok 15980 1727204169.65363: done checking to see if all hosts have failed 15980 1727204169.65364: getting the remaining hosts for this loop 15980 1727204169.65369: done getting the remaining hosts for this loop 15980 1727204169.65375: getting the next task for host managed-node2 15980 1727204169.65381: done getting next task for host managed-node2 15980 1727204169.65387: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15980 1727204169.65390: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204169.65408: getting variables 15980 1727204169.65410: in VariableManager get_vars() 15980 1727204169.65458: Calling all_inventory to load vars for managed-node2 15980 1727204169.65462: Calling groups_inventory to load vars for managed-node2 15980 1727204169.65465: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204169.65495: Calling all_plugins_play to load vars for managed-node2 15980 1727204169.65499: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204169.65574: Calling groups_plugins_play to load vars for managed-node2 15980 1727204169.68035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204169.70438: done with get_vars() 15980 1727204169.70538: done getting variables 15980 1727204169.70608: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:56:09 -0400 (0:00:00.086) 0:00:31.116 ***** 15980 1727204169.70642: entering _queue_task() for managed-node2/package 15980 1727204169.71057: worker is 1 (out of 1 available) 15980 1727204169.71072: exiting _queue_task() for managed-node2/package 15980 1727204169.71085: done queuing things up, now waiting for results queue to drain 15980 1727204169.71087: waiting for pending results... 
15980 1727204169.71493: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15980 1727204169.71499: in run() - task 127b8e07-fff9-5f1d-4b72-000000000044 15980 1727204169.71517: variable 'ansible_search_path' from source: unknown 15980 1727204169.71522: variable 'ansible_search_path' from source: unknown 15980 1727204169.71560: calling self._execute() 15980 1727204169.71669: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204169.71677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204169.71688: variable 'omit' from source: magic vars 15980 1727204169.72135: variable 'ansible_distribution_major_version' from source: facts 15980 1727204169.72140: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204169.72271: variable 'network_state' from source: role '' defaults 15980 1727204169.72351: Evaluated conditional (network_state != {}): False 15980 1727204169.72354: when evaluation is False, skipping this task 15980 1727204169.72357: _execute() done 15980 1727204169.72360: dumping result to json 15980 1727204169.72362: done dumping result, returning 15980 1727204169.72366: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-5f1d-4b72-000000000044] 15980 1727204169.72369: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000044 15980 1727204169.72575: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000044 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15980 1727204169.72622: no more pending results, returning what we have 15980 1727204169.72626: results queue empty 15980 1727204169.72627: checking for any_errors_fatal 15980 1727204169.72633: done checking for 
any_errors_fatal 15980 1727204169.72634: checking for max_fail_percentage 15980 1727204169.72635: done checking for max_fail_percentage 15980 1727204169.72636: checking to see if all hosts have failed and the running result is not ok 15980 1727204169.72637: done checking to see if all hosts have failed 15980 1727204169.72638: getting the remaining hosts for this loop 15980 1727204169.72639: done getting the remaining hosts for this loop 15980 1727204169.72643: getting the next task for host managed-node2 15980 1727204169.72649: done getting next task for host managed-node2 15980 1727204169.72653: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15980 1727204169.72655: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204169.72677: getting variables 15980 1727204169.72678: in VariableManager get_vars() 15980 1727204169.72718: Calling all_inventory to load vars for managed-node2 15980 1727204169.72721: Calling groups_inventory to load vars for managed-node2 15980 1727204169.72723: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204169.72737: Calling all_plugins_play to load vars for managed-node2 15980 1727204169.72740: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204169.72744: Calling groups_plugins_play to load vars for managed-node2 15980 1727204169.73375: WORKER PROCESS EXITING 15980 1727204169.75937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204169.78491: done with get_vars() 15980 1727204169.78527: done getting variables 15980 1727204169.78602: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:56:09 -0400 (0:00:00.079) 0:00:31.196 ***** 15980 1727204169.78638: entering _queue_task() for managed-node2/service 15980 1727204169.79023: worker is 1 (out of 1 available) 15980 1727204169.79038: exiting _queue_task() for managed-node2/service 15980 1727204169.79052: done queuing things up, now waiting for results queue to drain 15980 1727204169.79054: waiting for pending results... 
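The two install skips above ("Install NetworkManager and nmstate..." and "Install python3-libnmstate...") share the same guard, `network_state != {}`, which is False because `network_state` kept its empty default. A hedged sketch of how such a task is shaped (the task name and `when:` match the log; the body is a reconstruction, not the role's actual source at roles/network/tasks/main.yml):

```yaml
# Sketch only: reconstructed from the log's false_condition, not copied
# from the role.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager   # placeholder package list (assumption)
      - nmstate
    state: present
  when: network_state != {}   # skipped above: network_state is empty
```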
15980 1727204169.79363: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15980 1727204169.79576: in run() - task 127b8e07-fff9-5f1d-4b72-000000000045 15980 1727204169.79603: variable 'ansible_search_path' from source: unknown 15980 1727204169.79609: variable 'ansible_search_path' from source: unknown 15980 1727204169.79649: calling self._execute() 15980 1727204169.79974: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204169.79979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204169.79982: variable 'omit' from source: magic vars 15980 1727204169.80739: variable 'ansible_distribution_major_version' from source: facts 15980 1727204169.80744: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204169.80969: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204169.81483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204169.85858: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204169.86172: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204169.86177: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204169.86422: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204169.86450: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204169.86676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15980 1727204169.86845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204169.86876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204169.86918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204169.86932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204169.87111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204169.87137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204169.87287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204169.87374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204169.87379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204169.87413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204169.87437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204169.87461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204169.87620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204169.87672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204169.88162: variable 'network_connections' from source: play vars 15980 1727204169.88182: variable 'profile' from source: play vars 15980 1727204169.88285: variable 'profile' from source: play vars 15980 1727204169.88292: variable 'interface' from source: set_fact 15980 1727204169.88375: variable 'interface' from source: set_fact 15980 1727204169.88552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204169.88921: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204169.88962: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204169.89024: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204169.89041: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204169.89302: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204169.89305: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204169.89307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204169.89315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204169.89318: variable '__network_team_connections_defined' from source: role '' defaults 15980 1727204169.89733: variable 'network_connections' from source: play vars 15980 1727204169.89740: variable 'profile' from source: play vars 15980 1727204169.89828: variable 'profile' from source: play vars 15980 1727204169.89831: variable 'interface' from source: set_fact 15980 1727204169.89897: variable 'interface' from source: set_fact 15980 1727204169.89927: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15980 1727204169.89933: when evaluation is False, skipping this task 15980 1727204169.89936: _execute() done 15980 1727204169.89939: dumping result to json 15980 1727204169.89944: done dumping result, returning 15980 1727204169.89953: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [127b8e07-fff9-5f1d-4b72-000000000045] 15980 1727204169.89963: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000045 15980 1727204169.90070: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000045 15980 1727204169.90072: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15980 1727204169.90155: no more pending results, returning what we have 15980 1727204169.90161: results queue empty 15980 1727204169.90162: checking for any_errors_fatal 15980 1727204169.90174: done checking for any_errors_fatal 15980 1727204169.90175: checking for max_fail_percentage 15980 1727204169.90177: done checking for max_fail_percentage 15980 1727204169.90178: checking to see if all hosts have failed and the running result is not ok 15980 1727204169.90179: done checking to see if all hosts have failed 15980 1727204169.90180: getting the remaining hosts for this loop 15980 1727204169.90182: done getting the remaining hosts for this loop 15980 1727204169.90187: getting the next task for host managed-node2 15980 1727204169.90195: done getting next task for host managed-node2 15980 1727204169.90200: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15980 1727204169.90202: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204169.90220: getting variables 15980 1727204169.90222: in VariableManager get_vars() 15980 1727204169.90371: Calling all_inventory to load vars for managed-node2 15980 1727204169.90375: Calling groups_inventory to load vars for managed-node2 15980 1727204169.90383: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204169.90395: Calling all_plugins_play to load vars for managed-node2 15980 1727204169.90399: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204169.90404: Calling groups_plugins_play to load vars for managed-node2 15980 1727204169.92856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204169.95159: done with get_vars() 15980 1727204169.95205: done getting variables 15980 1727204169.95276: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:56:09 -0400 (0:00:00.166) 0:00:31.363 ***** 15980 1727204169.95314: entering _queue_task() for managed-node2/service 15980 1727204169.95907: worker is 1 (out of 1 available) 15980 1727204169.95919: exiting _queue_task() for managed-node2/service 15980 1727204169.95930: done queuing things up, now waiting for results queue to drain 15980 1727204169.95932: waiting for pending results... 
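The "Restart NetworkManager" skip above reports `false_condition: __network_wireless_connections_defined or __network_team_connections_defined`, i.e. the play's `network_connections` define neither wireless nor team profiles. A hedged sketch of such a task (name and condition from the log; the service body is an assumption, not the role's source):

```yaml
# Sketch only: reconstructed from the log, not the role's actual task.
- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager   # assumed service name
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```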
15980 1727204169.96184: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15980 1727204169.96193: in run() - task 127b8e07-fff9-5f1d-4b72-000000000046 15980 1727204169.96210: variable 'ansible_search_path' from source: unknown 15980 1727204169.96214: variable 'ansible_search_path' from source: unknown 15980 1727204169.96256: calling self._execute() 15980 1727204169.96474: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204169.96478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204169.96481: variable 'omit' from source: magic vars 15980 1727204169.96843: variable 'ansible_distribution_major_version' from source: facts 15980 1727204169.96856: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204169.97052: variable 'network_provider' from source: set_fact 15980 1727204169.97056: variable 'network_state' from source: role '' defaults 15980 1727204169.97069: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15980 1727204169.97077: variable 'omit' from source: magic vars 15980 1727204169.97119: variable 'omit' from source: magic vars 15980 1727204169.97157: variable 'network_service_name' from source: role '' defaults 15980 1727204169.97257: variable 'network_service_name' from source: role '' defaults 15980 1727204169.97430: variable '__network_provider_setup' from source: role '' defaults 15980 1727204169.97434: variable '__network_service_name_default_nm' from source: role '' defaults 15980 1727204169.97546: variable '__network_service_name_default_nm' from source: role '' defaults 15980 1727204169.97551: variable '__network_packages_default_nm' from source: role '' defaults 15980 1727204169.97644: variable '__network_packages_default_nm' from source: role '' defaults 15980 1727204169.97996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15980 1727204170.01376: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204170.01381: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204170.01391: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204170.01435: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204170.01471: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204170.01572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204170.01613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204170.01650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204170.01703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204170.01727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204170.01789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15980 1727204170.01913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204170.01979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204170.02185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204170.02190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204170.02483: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15980 1727204170.02694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204170.02730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204170.02764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204170.02817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204170.02842: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204170.02954: variable 'ansible_python' from source: facts 15980 1727204170.02989: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15980 1727204170.03181: variable '__network_wpa_supplicant_required' from source: role '' defaults 15980 1727204170.03282: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15980 1727204170.03426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204170.03460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204170.03496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204170.03622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204170.03626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204170.03649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204170.03718: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204170.03798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204170.03912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204170.03959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204170.04281: variable 'network_connections' from source: play vars 15980 1727204170.04299: variable 'profile' from source: play vars 15980 1727204170.04303: variable 'profile' from source: play vars 15980 1727204170.04305: variable 'interface' from source: set_fact 15980 1727204170.04372: variable 'interface' from source: set_fact 15980 1727204170.04523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204170.04932: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204170.05016: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204170.05137: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204170.05214: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204170.05296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204170.05372: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204170.05479: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204170.05588: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204170.05692: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204170.06437: variable 'network_connections' from source: play vars 15980 1727204170.06475: variable 'profile' from source: play vars 15980 1727204170.06656: variable 'profile' from source: play vars 15980 1727204170.06692: variable 'interface' from source: set_fact 15980 1727204170.06900: variable 'interface' from source: set_fact 15980 1727204170.06962: variable '__network_packages_default_wireless' from source: role '' defaults 15980 1727204170.07065: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204170.07449: variable 'network_connections' from source: play vars 15980 1727204170.07460: variable 'profile' from source: play vars 15980 1727204170.07541: variable 'profile' from source: play vars 15980 1727204170.07545: variable 'interface' from source: set_fact 15980 1727204170.07633: variable 'interface' from source: set_fact 15980 1727204170.07729: variable '__network_packages_default_team' from source: role '' defaults 15980 1727204170.07760: variable '__network_team_connections_defined' from source: role '' defaults 15980 1727204170.08096: variable 
'network_connections' from source: play vars 15980 1727204170.08099: variable 'profile' from source: play vars 15980 1727204170.08176: variable 'profile' from source: play vars 15980 1727204170.08180: variable 'interface' from source: set_fact 15980 1727204170.08364: variable 'interface' from source: set_fact 15980 1727204170.08371: variable '__network_service_name_default_initscripts' from source: role '' defaults 15980 1727204170.08400: variable '__network_service_name_default_initscripts' from source: role '' defaults 15980 1727204170.08442: variable '__network_packages_default_initscripts' from source: role '' defaults 15980 1727204170.08480: variable '__network_packages_default_initscripts' from source: role '' defaults 15980 1727204170.08872: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15980 1727204170.09428: variable 'network_connections' from source: play vars 15980 1727204170.09437: variable 'profile' from source: play vars 15980 1727204170.09521: variable 'profile' from source: play vars 15980 1727204170.09524: variable 'interface' from source: set_fact 15980 1727204170.09602: variable 'interface' from source: set_fact 15980 1727204170.09621: variable 'ansible_distribution' from source: facts 15980 1727204170.09624: variable '__network_rh_distros' from source: role '' defaults 15980 1727204170.09634: variable 'ansible_distribution_major_version' from source: facts 15980 1727204170.09651: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15980 1727204170.09970: variable 'ansible_distribution' from source: facts 15980 1727204170.09974: variable '__network_rh_distros' from source: role '' defaults 15980 1727204170.09976: variable 'ansible_distribution_major_version' from source: facts 15980 1727204170.09979: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15980 1727204170.10087: variable 'ansible_distribution' from source: 
facts 15980 1727204170.10091: variable '__network_rh_distros' from source: role '' defaults 15980 1727204170.10097: variable 'ansible_distribution_major_version' from source: facts 15980 1727204170.10187: variable 'network_provider' from source: set_fact 15980 1727204170.10191: variable 'omit' from source: magic vars 15980 1727204170.10215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204170.10239: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204170.10258: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204170.10278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204170.10287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204170.10312: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204170.10315: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204170.10320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204170.10397: Set connection var ansible_connection to ssh 15980 1727204170.10403: Set connection var ansible_pipelining to False 15980 1727204170.10410: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204170.10416: Set connection var ansible_timeout to 10 15980 1727204170.10422: Set connection var ansible_shell_type to sh 15980 1727204170.10427: Set connection var ansible_shell_executable to /bin/sh 15980 1727204170.10452: variable 'ansible_shell_executable' from source: unknown 15980 1727204170.10455: variable 'ansible_connection' from source: unknown 15980 1727204170.10458: variable 'ansible_module_compression' from source: unknown 15980 1727204170.10460: 
variable 'ansible_shell_type' from source: unknown 15980 1727204170.10463: variable 'ansible_shell_executable' from source: unknown 15980 1727204170.10468: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204170.10475: variable 'ansible_pipelining' from source: unknown 15980 1727204170.10479: variable 'ansible_timeout' from source: unknown 15980 1727204170.10481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204170.10582: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204170.10593: variable 'omit' from source: magic vars 15980 1727204170.10598: starting attempt loop 15980 1727204170.10603: running the handler 15980 1727204170.10670: variable 'ansible_facts' from source: unknown 15980 1727204170.11314: _low_level_execute_command(): starting 15980 1727204170.11319: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204170.11948: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204170.11959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204170.11964: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204170.12053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204170.12056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204170.12131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204170.13935: stdout chunk (state=3): >>>/root <<< 15980 1727204170.14096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204170.14111: stderr chunk (state=3): >>><<< 15980 1727204170.14118: stdout chunk (state=3): >>><<< 15980 1727204170.14139: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204170.14150: _low_level_execute_command(): starting 15980 1727204170.14157: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204170.1413956-18382-141298961232438 `" && echo ansible-tmp-1727204170.1413956-18382-141298961232438="` echo /root/.ansible/tmp/ansible-tmp-1727204170.1413956-18382-141298961232438 `" ) && sleep 0' 15980 1727204170.14718: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204170.14721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204170.14724: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 15980 1727204170.14729: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204170.14790: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204170.14793: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204170.14798: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204170.14899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204170.16958: stdout chunk (state=3): >>>ansible-tmp-1727204170.1413956-18382-141298961232438=/root/.ansible/tmp/ansible-tmp-1727204170.1413956-18382-141298961232438 <<< 15980 1727204170.17063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204170.17125: stderr chunk (state=3): >>><<< 15980 1727204170.17129: stdout chunk (state=3): >>><<< 15980 1727204170.17146: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204170.1413956-18382-141298961232438=/root/.ansible/tmp/ansible-tmp-1727204170.1413956-18382-141298961232438 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204170.17180: variable 'ansible_module_compression' from 
source: unknown 15980 1727204170.17227: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15980 1727204170.17340: variable 'ansible_facts' from source: unknown 15980 1727204170.17511: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204170.1413956-18382-141298961232438/AnsiballZ_systemd.py 15980 1727204170.17647: Sending initial data 15980 1727204170.17651: Sent initial data (156 bytes) 15980 1727204170.18164: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204170.18171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204170.18174: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204170.18176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204170.18229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204170.18232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204170.18310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
15980 1727204170.20023: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204170.20100: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15980 1727204170.20184: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp6inqgyv6 /root/.ansible/tmp/ansible-tmp-1727204170.1413956-18382-141298961232438/AnsiballZ_systemd.py <<< 15980 1727204170.20187: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204170.1413956-18382-141298961232438/AnsiballZ_systemd.py" <<< 15980 1727204170.20259: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp6inqgyv6" to remote "/root/.ansible/tmp/ansible-tmp-1727204170.1413956-18382-141298961232438/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204170.1413956-18382-141298961232438/AnsiballZ_systemd.py" <<< 15980 1727204170.21918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204170.21998: stderr chunk (state=3): >>><<< 15980 1727204170.22005: stdout chunk (state=3): >>><<< 15980 1727204170.22023: done transferring module to remote 
15980 1727204170.22034: _low_level_execute_command(): starting 15980 1727204170.22038: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204170.1413956-18382-141298961232438/ /root/.ansible/tmp/ansible-tmp-1727204170.1413956-18382-141298961232438/AnsiballZ_systemd.py && sleep 0' 15980 1727204170.22946: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204170.22951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204170.22953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 15980 1727204170.22956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204170.23010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204170.23014: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204170.23017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204170.23151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204170.25088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 
1727204170.25132: stderr chunk (state=3): >>><<< 15980 1727204170.25141: stdout chunk (state=3): >>><<< 15980 1727204170.25156: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204170.25160: _low_level_execute_command(): starting 15980 1727204170.25163: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204170.1413956-18382-141298961232438/AnsiballZ_systemd.py && sleep 0' 15980 1727204170.25641: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204170.25645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match not found <<< 15980 1727204170.25671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204170.25675: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204170.25691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204170.25740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204170.25743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204170.25824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204170.58337: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", 
"NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4538368", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3524354048", "CPUUsageNSec": "776526000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not 
set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCO<<< 15980 1727204170.58345: stdout chunk (state=3): >>>RE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", 
"LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target multi-user.target network.service shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service dbus.socket sysinit.target basic.target cloud-init-local.service system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:55 EDT", 
"StateChangeTimestampMonotonic": "382184288", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15980 1727204170.60338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 15980 1727204170.60398: stderr chunk (state=3): >>><<< 15980 1727204170.60402: stdout chunk (state=3): >>><<< 15980 1727204170.60417: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4538368", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3524354048", "CPUUsageNSec": "776526000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", 
"MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", 
"ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", 
"Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target multi-user.target network.service shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service dbus.socket sysinit.target basic.target cloud-init-local.service system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:55 EDT", "StateChangeTimestampMonotonic": "382184288", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", 
"Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
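
The `module_args` echoed in the result above (`name`, `state`, `enabled`, `scope`, etc.) correspond to an ordinary `systemd` module call. As a hedged reconstruction — only the module arguments and the task name are confirmed by this log; the surrounding file layout is an assumption — the task that produced this status dump would look roughly like:

```yaml
# Hypothetical reconstruction of the task behind the result above.
# Only name/state/enabled are confirmed by the module_args in the log;
# no_log is inferred from the "censored" output shown later.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true
```

The large `"status"` dictionary in the stdout chunk is the set of unit properties the module reads back from systemd (the same keys `systemctl show NetworkManager.service` would report), which is why it includes everything from `ExecStart` to the image policies.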
15980 1727204170.60549: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204170.1413956-18382-141298961232438/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204170.60567: _low_level_execute_command(): starting 15980 1727204170.60573: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204170.1413956-18382-141298961232438/ > /dev/null 2>&1 && sleep 0' 15980 1727204170.61034: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204170.61073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204170.61077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 15980 1727204170.61080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15980 1727204170.61082: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15980 1727204170.61084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204170.61134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204170.61138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204170.61140: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204170.61216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204170.63149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204170.63210: stderr chunk (state=3): >>><<< 15980 1727204170.63213: stdout chunk (state=3): >>><<< 15980 1727204170.63226: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204170.63236: handler run complete 15980 1727204170.63284: attempt loop complete, returning result 15980 1727204170.63288: _execute() done 15980 1727204170.63290: dumping result to json 15980 1727204170.63302: done dumping result, returning 15980 1727204170.63311: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-5f1d-4b72-000000000046] 15980 1727204170.63316: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000046 15980 1727204170.63528: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000046 15980 1727204170.63531: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15980 1727204170.63585: no more pending results, returning what we have 15980 1727204170.63588: results queue empty 15980 1727204170.63589: checking for any_errors_fatal 15980 1727204170.63597: done checking for any_errors_fatal 15980 1727204170.63598: checking for max_fail_percentage 15980 1727204170.63600: done checking for max_fail_percentage 15980 1727204170.63600: checking to see if all hosts have failed and the running result is not ok 15980 1727204170.63601: done checking to see if all hosts have failed 15980 1727204170.63602: getting the remaining hosts for this loop 15980 1727204170.63604: done getting the remaining hosts for this loop 15980 1727204170.63610: getting the next task for host managed-node2 15980 1727204170.63616: done getting next task for host managed-node2 15980 1727204170.63621: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15980 1727204170.63623: ^ state is: HOST STATE: 
block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204170.63634: getting variables 15980 1727204170.63636: in VariableManager get_vars() 15980 1727204170.63676: Calling all_inventory to load vars for managed-node2 15980 1727204170.63707: Calling groups_inventory to load vars for managed-node2 15980 1727204170.63710: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204170.63722: Calling all_plugins_play to load vars for managed-node2 15980 1727204170.63724: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204170.63727: Calling groups_plugins_play to load vars for managed-node2 15980 1727204170.64873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204170.66641: done with get_vars() 15980 1727204170.66690: done getting variables 15980 1727204170.66762: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:56:10 -0400 (0:00:00.714) 0:00:32.078 ***** 15980 1727204170.66803: entering _queue_task() for managed-node2/service 15980 1727204170.67187: worker is 1 (out of 1 available) 15980 1727204170.67202: exiting _queue_task() for managed-node2/service 15980 1727204170.67219: done queuing things up, now waiting for results queue to drain 15980 1727204170.67222: waiting 
for pending results... 15980 1727204170.67638: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15980 1727204170.67735: in run() - task 127b8e07-fff9-5f1d-4b72-000000000047 15980 1727204170.67743: variable 'ansible_search_path' from source: unknown 15980 1727204170.67746: variable 'ansible_search_path' from source: unknown 15980 1727204170.67749: calling self._execute() 15980 1727204170.67902: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204170.67907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204170.67911: variable 'omit' from source: magic vars 15980 1727204170.68359: variable 'ansible_distribution_major_version' from source: facts 15980 1727204170.68408: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204170.68515: variable 'network_provider' from source: set_fact 15980 1727204170.68518: Evaluated conditional (network_provider == "nm"): True 15980 1727204170.68649: variable '__network_wpa_supplicant_required' from source: role '' defaults 15980 1727204170.68742: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15980 1727204170.68919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204170.70903: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204170.70973: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204170.71078: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204170.71082: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204170.71105: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204170.71223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204170.71280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204170.71444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204170.71447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204170.71450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204170.71453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204170.71478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204170.71507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204170.71557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204170.71588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204170.71641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204170.71677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204170.71709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204170.71770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204170.71779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204170.71990: variable 'network_connections' from source: play vars 15980 1727204170.71994: variable 'profile' from source: play vars 15980 1727204170.72066: variable 'profile' from source: play vars 15980 1727204170.72070: variable 'interface' from source: set_fact 15980 1727204170.72154: variable 'interface' from source: set_fact 15980 1727204170.72264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204170.72493: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204170.72548: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204170.72613: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204170.72659: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204170.72734: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204170.72762: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204170.72811: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204170.72843: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204170.72919: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204170.73322: variable 'network_connections' from source: play vars 15980 1727204170.73332: variable 'profile' from source: play vars 15980 1727204170.73460: variable 'profile' from source: play vars 15980 1727204170.73468: variable 'interface' from source: set_fact 15980 1727204170.73571: variable 'interface' from source: set_fact 15980 1727204170.73607: Evaluated conditional (__network_wpa_supplicant_required): False 15980 1727204170.73611: when evaluation is False, skipping this task 15980 1727204170.73614: _execute() done 15980 1727204170.73625: dumping result 
to json 15980 1727204170.73633: done dumping result, returning 15980 1727204170.73636: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-5f1d-4b72-000000000047] 15980 1727204170.73638: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000047 15980 1727204170.73797: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000047 15980 1727204170.73804: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15980 1727204170.74036: no more pending results, returning what we have 15980 1727204170.74039: results queue empty 15980 1727204170.74040: checking for any_errors_fatal 15980 1727204170.74061: done checking for any_errors_fatal 15980 1727204170.74062: checking for max_fail_percentage 15980 1727204170.74064: done checking for max_fail_percentage 15980 1727204170.74065: checking to see if all hosts have failed and the running result is not ok 15980 1727204170.74067: done checking to see if all hosts have failed 15980 1727204170.74075: getting the remaining hosts for this loop 15980 1727204170.74079: done getting the remaining hosts for this loop 15980 1727204170.74096: getting the next task for host managed-node2 15980 1727204170.74107: done getting next task for host managed-node2 15980 1727204170.74111: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15980 1727204170.74114: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204170.74138: getting variables 15980 1727204170.74143: in VariableManager get_vars() 15980 1727204170.74195: Calling all_inventory to load vars for managed-node2 15980 1727204170.74200: Calling groups_inventory to load vars for managed-node2 15980 1727204170.74203: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204170.74214: Calling all_plugins_play to load vars for managed-node2 15980 1727204170.74216: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204170.74222: Calling groups_plugins_play to load vars for managed-node2 15980 1727204170.76229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204170.85751: done with get_vars() 15980 1727204170.85799: done getting variables 15980 1727204170.86217: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:56:10 -0400 (0:00:00.194) 0:00:32.272 ***** 15980 1727204170.86250: entering _queue_task() for managed-node2/service 15980 1727204170.87010: worker is 1 (out of 1 available) 15980 1727204170.87024: exiting _queue_task() for managed-node2/service 15980 1727204170.87037: done queuing things up, now waiting for results queue to drain 15980 1727204170.87039: waiting for pending results... 
15980 1727204170.87603: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 15980 1727204170.87610: in run() - task 127b8e07-fff9-5f1d-4b72-000000000048 15980 1727204170.87613: variable 'ansible_search_path' from source: unknown 15980 1727204170.87616: variable 'ansible_search_path' from source: unknown 15980 1727204170.87682: calling self._execute() 15980 1727204170.87914: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204170.87920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204170.87924: variable 'omit' from source: magic vars 15980 1727204170.88549: variable 'ansible_distribution_major_version' from source: facts 15980 1727204170.88583: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204170.88730: variable 'network_provider' from source: set_fact 15980 1727204170.88748: Evaluated conditional (network_provider == "initscripts"): False 15980 1727204170.88759: when evaluation is False, skipping this task 15980 1727204170.88771: _execute() done 15980 1727204170.88788: dumping result to json 15980 1727204170.88798: done dumping result, returning 15980 1727204170.88812: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-5f1d-4b72-000000000048] 15980 1727204170.88894: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000048 15980 1727204170.88989: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000048 15980 1727204170.89108: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15980 1727204170.89164: no more pending results, returning what we have 15980 1727204170.89170: results queue empty 15980 1727204170.89172: checking for any_errors_fatal 15980 1727204170.89186: done checking for 
any_errors_fatal 15980 1727204170.89187: checking for max_fail_percentage 15980 1727204170.89189: done checking for max_fail_percentage 15980 1727204170.89190: checking to see if all hosts have failed and the running result is not ok 15980 1727204170.89191: done checking to see if all hosts have failed 15980 1727204170.89192: getting the remaining hosts for this loop 15980 1727204170.89194: done getting the remaining hosts for this loop 15980 1727204170.89199: getting the next task for host managed-node2 15980 1727204170.89207: done getting next task for host managed-node2 15980 1727204170.89219: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15980 1727204170.89222: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204170.89244: getting variables 15980 1727204170.89246: in VariableManager get_vars() 15980 1727204170.89303: Calling all_inventory to load vars for managed-node2 15980 1727204170.89306: Calling groups_inventory to load vars for managed-node2 15980 1727204170.89309: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204170.89376: Calling all_plugins_play to load vars for managed-node2 15980 1727204170.89381: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204170.89386: Calling groups_plugins_play to load vars for managed-node2 15980 1727204170.91993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204170.94585: done with get_vars() 15980 1727204170.94630: done getting variables 15980 1727204170.94702: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:56:10 -0400 (0:00:00.084) 0:00:32.357 ***** 15980 1727204170.94743: entering _queue_task() for managed-node2/copy 15980 1727204170.95161: worker is 1 (out of 1 available) 15980 1727204170.95282: exiting _queue_task() for managed-node2/copy 15980 1727204170.95294: done queuing things up, now waiting for results queue to drain 15980 1727204170.95297: waiting for pending results... 
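Each `get_vars()` call above walks the variable sources in a fixed order (`all_inventory`, `groups_inventory`, the plugin sources, then play- and host-level vars), with later sources overriding earlier ones. A toy illustration of that last-writer-wins merge, with stage names taken from the log but the variable values invented for the example:

```python
# Toy illustration of precedence merging: later sources override
# earlier ones. The stage names follow the "Calling ... to load vars"
# lines above; the values themselves are made up for this sketch.

from functools import reduce

stages = [
    ("all_inventory",            {"ansible_timeout": 30}),
    ("groups_plugins_inventory", {"ansible_timeout": 20, "network_provider": "nm"}),
    ("host_vars",                {"ansible_timeout": 10}),
]

# Merge left to right: each later stage's keys win over earlier ones.
merged = reduce(lambda acc, stage: {**acc, **stage[1]}, stages, {})
```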
15980 1727204170.95592: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15980 1727204170.95688: in run() - task 127b8e07-fff9-5f1d-4b72-000000000049 15980 1727204170.95713: variable 'ansible_search_path' from source: unknown 15980 1727204170.95722: variable 'ansible_search_path' from source: unknown 15980 1727204170.95773: calling self._execute() 15980 1727204170.95897: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204170.95914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204170.95932: variable 'omit' from source: magic vars 15980 1727204170.96397: variable 'ansible_distribution_major_version' from source: facts 15980 1727204170.96452: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204170.96567: variable 'network_provider' from source: set_fact 15980 1727204170.96580: Evaluated conditional (network_provider == "initscripts"): False 15980 1727204170.96590: when evaluation is False, skipping this task 15980 1727204170.96598: _execute() done 15980 1727204170.96606: dumping result to json 15980 1727204170.96671: done dumping result, returning 15980 1727204170.96679: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-5f1d-4b72-000000000049] 15980 1727204170.96683: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000049 skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15980 1727204170.96922: no more pending results, returning what we have 15980 1727204170.96929: results queue empty 15980 1727204170.96931: checking for any_errors_fatal 15980 1727204170.96940: done checking for any_errors_fatal 15980 1727204170.96941: checking for max_fail_percentage 15980 
1727204170.96943: done checking for max_fail_percentage 15980 1727204170.96944: checking to see if all hosts have failed and the running result is not ok 15980 1727204170.96945: done checking to see if all hosts have failed 15980 1727204170.96946: getting the remaining hosts for this loop 15980 1727204170.96947: done getting the remaining hosts for this loop 15980 1727204170.96952: getting the next task for host managed-node2 15980 1727204170.96958: done getting next task for host managed-node2 15980 1727204170.96963: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15980 1727204170.96967: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204170.96984: getting variables 15980 1727204170.96986: in VariableManager get_vars() 15980 1727204170.97036: Calling all_inventory to load vars for managed-node2 15980 1727204170.97040: Calling groups_inventory to load vars for managed-node2 15980 1727204170.97042: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204170.97059: Calling all_plugins_play to load vars for managed-node2 15980 1727204170.97063: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204170.97271: Calling groups_plugins_play to load vars for managed-node2 15980 1727204170.98101: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000049 15980 1727204170.98106: WORKER PROCESS EXITING 15980 1727204171.01069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204171.03256: done with get_vars() 15980 1727204171.03296: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:56:11 -0400 (0:00:00.086) 0:00:32.444 ***** 15980 1727204171.03395: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 15980 1727204171.03787: worker is 1 (out of 1 available) 15980 1727204171.03801: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 15980 1727204171.03814: done queuing things up, now waiting for results queue to drain 15980 1727204171.03816: waiting for pending results... 15980 1727204171.04137: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15980 1727204171.04282: in run() - task 127b8e07-fff9-5f1d-4b72-00000000004a 15980 1727204171.04309: variable 'ansible_search_path' from source: unknown 15980 1727204171.04316: variable 'ansible_search_path' from source: unknown 15980 1727204171.04362: calling self._execute() 15980 1727204171.04484: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204171.04500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204171.04520: variable 'omit' from source: magic vars 15980 1727204171.04980: variable 'ansible_distribution_major_version' from source: facts 15980 1727204171.05000: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204171.05014: variable 'omit' from source: magic vars 15980 1727204171.05074: variable 'omit' from source: magic vars 15980 1727204171.05266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204171.07720: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204171.07816: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 
1727204171.07869: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204171.07912: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204171.07950: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204171.08059: variable 'network_provider' from source: set_fact 15980 1727204171.08219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204171.08263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204171.08471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204171.08475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204171.08478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204171.08480: variable 'omit' from source: magic vars 15980 1727204171.08602: variable 'omit' from source: magic vars 15980 1727204171.08727: variable 'network_connections' from source: play vars 15980 1727204171.08746: variable 'profile' from source: play vars 15980 1727204171.08822: variable 'profile' from source: play vars 15980 1727204171.08838: variable 'interface' from source: set_fact 
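The `network_connections` play variable resolved here is what ultimately becomes the module's `connections` argument (visible in the invocation later in this log: one profile named `LSR-TST-br31` with `state: down`). A sketch of that shape, with a hypothetical `validate_connections()` helper that is not part of the role:

```python
# Shape of the network_connections payload as seen in the module
# invocation later in this log. validate_connections() is a
# hypothetical illustration, not the role's actual validation code.

connections = [
    {"name": "LSR-TST-br31", "state": "down"},
]

def validate_connections(conns: list) -> list:
    """Require a name on every profile; default a missing state to 'up'."""
    out = []
    for conn in conns:
        if "name" not in conn:
            raise ValueError("connection profile requires a name")
        out.append({**conn, "state": conn.get("state", "up")})
    return out

checked = validate_connections(connections)
```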
15980 1727204171.08906: variable 'interface' from source: set_fact 15980 1727204171.09075: variable 'omit' from source: magic vars 15980 1727204171.09092: variable '__lsr_ansible_managed' from source: task vars 15980 1727204171.09170: variable '__lsr_ansible_managed' from source: task vars 15980 1727204171.09390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15980 1727204171.09663: Loaded config def from plugin (lookup/template) 15980 1727204171.09679: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15980 1727204171.09719: File lookup term: get_ansible_managed.j2 15980 1727204171.09730: variable 'ansible_search_path' from source: unknown 15980 1727204171.09742: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15980 1727204171.09761: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 
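The `search_path` listing above follows a simple pattern: for each evaluation directory (the role dir, the role's `tasks` dir, the playbook dir), a `templates/` subdirectory is tried before the directory itself. A small sketch reproducing that ordering:

```python
# Reproduces the template search-path ordering shown above: for each
# candidate directory, <dir>/templates/<name> is tried before
# <dir>/<name>.

from pathlib import PurePosixPath

def template_search_path(eval_dirs, name):
    paths = []
    for d in eval_dirs:
        base = PurePosixPath(d)
        paths.append(str(base / "templates" / name))  # templates/ subdir first
        paths.append(str(base / name))                # then the dir itself
    return paths

role = ("/tmp/collections-MVC/ansible_collections/"
        "fedora/linux_system_roles/roles/network")
candidates = template_search_path([role, role + "/tasks"],
                                  "get_ansible_managed.j2")
```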
15980 1727204171.09788: variable 'ansible_search_path' from source: unknown 15980 1727204171.24274: variable 'ansible_managed' from source: unknown 15980 1727204171.24392: variable 'omit' from source: magic vars 15980 1727204171.24440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204171.24479: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204171.24507: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204171.24537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204171.24552: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204171.24588: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204171.24596: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204171.24603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204171.24711: Set connection var ansible_connection to ssh 15980 1727204171.24729: Set connection var ansible_pipelining to False 15980 1727204171.24744: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204171.24753: Set connection var ansible_timeout to 10 15980 1727204171.24762: Set connection var ansible_shell_type to sh 15980 1727204171.24845: Set connection var ansible_shell_executable to /bin/sh 15980 1727204171.24848: variable 'ansible_shell_executable' from source: unknown 15980 1727204171.24851: variable 'ansible_connection' from source: unknown 15980 1727204171.24853: variable 'ansible_module_compression' from source: unknown 15980 1727204171.24856: variable 'ansible_shell_type' from source: unknown 15980 1727204171.24858: variable 'ansible_shell_executable' 
from source: unknown 15980 1727204171.24860: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204171.24862: variable 'ansible_pipelining' from source: unknown 15980 1727204171.24864: variable 'ansible_timeout' from source: unknown 15980 1727204171.24868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204171.25002: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204171.25029: variable 'omit' from source: magic vars 15980 1727204171.25042: starting attempt loop 15980 1727204171.25049: running the handler 15980 1727204171.25070: _low_level_execute_command(): starting 15980 1727204171.25081: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204171.25892: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204171.25928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204171.26005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204171.26036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204171.26049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204171.26168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204171.27956: stdout chunk (state=3): >>>/root <<< 15980 1727204171.28087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204171.28493: stderr chunk (state=3): >>><<< 15980 1727204171.28497: stdout chunk (state=3): >>><<< 15980 1727204171.28500: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204171.28502: 
_low_level_execute_command(): starting 15980 1727204171.28505: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204171.283899-18551-197922768649989 `" && echo ansible-tmp-1727204171.283899-18551-197922768649989="` echo /root/.ansible/tmp/ansible-tmp-1727204171.283899-18551-197922768649989 `" ) && sleep 0' 15980 1727204171.29732: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204171.29752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204171.29887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204171.29913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204171.30024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204171.32030: stdout chunk (state=3): >>>ansible-tmp-1727204171.283899-18551-197922768649989=/root/.ansible/tmp/ansible-tmp-1727204171.283899-18551-197922768649989 <<< 15980 
1727204171.32287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204171.32298: stdout chunk (state=3): >>><<< 15980 1727204171.32575: stderr chunk (state=3): >>><<< 15980 1727204171.32579: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204171.283899-18551-197922768649989=/root/.ansible/tmp/ansible-tmp-1727204171.283899-18551-197922768649989 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204171.32581: variable 'ansible_module_compression' from source: unknown 15980 1727204171.32583: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15980 1727204171.32585: variable 'ansible_facts' from source: unknown 15980 1727204171.32822: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204171.283899-18551-197922768649989/AnsiballZ_network_connections.py 15980 1727204171.33324: Sending initial data 15980 1727204171.33332: Sent initial data (167 bytes) 15980 1727204171.34539: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204171.34544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204171.34579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204171.34683: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204171.34719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204171.34734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204171.34752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204171.34889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204171.36613: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server 
supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204171.36677: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15980 1727204171.36751: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpx821pzdf /root/.ansible/tmp/ansible-tmp-1727204171.283899-18551-197922768649989/AnsiballZ_network_connections.py <<< 15980 1727204171.36755: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204171.283899-18551-197922768649989/AnsiballZ_network_connections.py" <<< 15980 1727204171.37027: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpx821pzdf" to remote "/root/.ansible/tmp/ansible-tmp-1727204171.283899-18551-197922768649989/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204171.283899-18551-197922768649989/AnsiballZ_network_connections.py" <<< 15980 1727204171.38869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204171.39061: stderr chunk (state=3): >>><<< 15980 1727204171.39066: stdout chunk (state=3): >>><<< 15980 1727204171.39098: done transferring module to remote 15980 1727204171.39110: _low_level_execute_command(): starting 15980 1727204171.39172: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204171.283899-18551-197922768649989/ /root/.ansible/tmp/ansible-tmp-1727204171.283899-18551-197922768649989/AnsiballZ_network_connections.py && sleep 0' 15980 1727204171.40423: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204171.40431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204171.40477: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204171.40481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204171.40695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204171.40699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204171.40727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204171.40803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204171.42788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204171.42793: stdout chunk (state=3): >>><<< 15980 1727204171.42796: stderr chunk (state=3): >>><<< 15980 1727204171.42819: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204171.42822: _low_level_execute_command(): starting 15980 1727204171.42825: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204171.283899-18551-197922768649989/AnsiballZ_network_connections.py && sleep 0' 15980 1727204171.44232: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204171.44586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204171.44614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204171.44702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204171.76997: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15980 1727204171.80294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 15980 1727204171.80298: stdout chunk (state=3): >>><<< 15980 1727204171.80300: stderr chunk (state=3): >>><<< 15980 1727204171.80473: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
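The record above shows the `network_connections` module returning its result as a single JSON object over stdout, which `_low_level_execute_command()` then hands back for parsing. A minimal sketch of reading such a result (the structure is trimmed from the log above; this is not Ansible source code):

```python
import json

# A trimmed copy of the module result seen on stdout in the log above.
raw = ('{"changed": true, "warnings": [], "stderr": "\\n",'
       ' "invocation": {"module_args": {"provider": "nm",'
       ' "connections": [{"name": "LSR-TST-br31", "state": "down"}]}}}')

result = json.loads(raw)
print(result["changed"])                                # True
print(result["invocation"]["module_args"]["provider"])  # nm
```

The controller treats `rc=0` plus this parsed dict as the task outcome; note `"changed": true` here matches the `changed: [managed-node2]` status reported later in the log.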
15980 1727204171.80477: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204171.283899-18551-197922768649989/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204171.80480: _low_level_execute_command(): starting 15980 1727204171.80483: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204171.283899-18551-197922768649989/ > /dev/null 2>&1 && sleep 0' 15980 1727204171.81708: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204171.81829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204171.82033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204171.84098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204171.84111: stdout chunk (state=3): >>><<< 15980 1727204171.84122: stderr chunk (state=3): >>><<< 15980 1727204171.84144: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204171.84157: handler run complete 15980 
1727204171.84199: attempt loop complete, returning result 15980 1727204171.84348: _execute() done 15980 1727204171.84356: dumping result to json 15980 1727204171.84369: done dumping result, returning 15980 1727204171.84384: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-5f1d-4b72-00000000004a] 15980 1727204171.84393: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000004a changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 15980 1727204171.84875: no more pending results, returning what we have 15980 1727204171.84878: results queue empty 15980 1727204171.84879: checking for any_errors_fatal 15980 1727204171.84884: done checking for any_errors_fatal 15980 1727204171.84885: checking for max_fail_percentage 15980 1727204171.84886: done checking for max_fail_percentage 15980 1727204171.84887: checking to see if all hosts have failed and the running result is not ok 15980 1727204171.84888: done checking to see if all hosts have failed 15980 1727204171.84889: getting the remaining hosts for this loop 15980 1727204171.84890: done getting the remaining hosts for this loop 15980 1727204171.84894: getting the next task for host managed-node2 15980 1727204171.84900: done getting next task for host managed-node2 15980 1727204171.84904: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15980 1727204171.84906: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15980 1727204171.84920: getting variables 15980 1727204171.84921: in VariableManager get_vars() 15980 1727204171.84957: Calling all_inventory to load vars for managed-node2 15980 1727204171.84960: Calling groups_inventory to load vars for managed-node2 15980 1727204171.84962: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204171.85178: Calling all_plugins_play to load vars for managed-node2 15980 1727204171.85183: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204171.85192: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000004a 15980 1727204171.85196: WORKER PROCESS EXITING 15980 1727204171.85201: Calling groups_plugins_play to load vars for managed-node2 15980 1727204171.87879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204171.93126: done with get_vars() 15980 1727204171.93172: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:56:11 -0400 (0:00:00.899) 0:00:33.343 ***** 15980 1727204171.93368: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 15980 1727204171.94171: worker is 1 (out of 1 available) 15980 1727204171.94185: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 15980 1727204171.94374: done queuing things up, now waiting for results queue to drain 15980 1727204171.94377: waiting for pending results... 
15980 1727204171.94703: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 15980 1727204171.95025: in run() - task 127b8e07-fff9-5f1d-4b72-00000000004b 15980 1727204171.95043: variable 'ansible_search_path' from source: unknown 15980 1727204171.95047: variable 'ansible_search_path' from source: unknown 15980 1727204171.95087: calling self._execute() 15980 1727204171.95396: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204171.95402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204171.95414: variable 'omit' from source: magic vars 15980 1727204171.96151: variable 'ansible_distribution_major_version' from source: facts 15980 1727204171.96164: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204171.96618: variable 'network_state' from source: role '' defaults 15980 1727204171.96628: Evaluated conditional (network_state != {}): False 15980 1727204171.96633: when evaluation is False, skipping this task 15980 1727204171.96637: _execute() done 15980 1727204171.96641: dumping result to json 15980 1727204171.96644: done dumping result, returning 15980 1727204171.96657: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-5f1d-4b72-00000000004b] 15980 1727204171.96660: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000004b 15980 1727204171.96764: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000004b 15980 1727204171.96769: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15980 1727204171.96848: no more pending results, returning what we have 15980 1727204171.96854: results queue empty 15980 1727204171.96855: checking for any_errors_fatal 15980 1727204171.96873: done checking for any_errors_fatal 
15980 1727204171.96874: checking for max_fail_percentage 15980 1727204171.96876: done checking for max_fail_percentage 15980 1727204171.96877: checking to see if all hosts have failed and the running result is not ok 15980 1727204171.96878: done checking to see if all hosts have failed 15980 1727204171.96879: getting the remaining hosts for this loop 15980 1727204171.96881: done getting the remaining hosts for this loop 15980 1727204171.96888: getting the next task for host managed-node2 15980 1727204171.96896: done getting next task for host managed-node2 15980 1727204171.96900: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15980 1727204171.96903: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204171.96922: getting variables 15980 1727204171.96924: in VariableManager get_vars() 15980 1727204171.96974: Calling all_inventory to load vars for managed-node2 15980 1727204171.96977: Calling groups_inventory to load vars for managed-node2 15980 1727204171.96980: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204171.96994: Calling all_plugins_play to load vars for managed-node2 15980 1727204171.96997: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204171.96999: Calling groups_plugins_play to load vars for managed-node2 15980 1727204172.00561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204172.03534: done with get_vars() 15980 1727204172.03578: done getting variables 15980 1727204172.03656: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:56:12 -0400 (0:00:00.103) 0:00:33.447 ***** 15980 1727204172.03694: entering _queue_task() for managed-node2/debug 15980 1727204172.04517: worker is 1 (out of 1 available) 15980 1727204172.04533: exiting _queue_task() for managed-node2/debug 15980 1727204172.04546: done queuing things up, now waiting for results queue to drain 15980 1727204172.04549: waiting for pending results... 
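The "Configure networking state" task above was skipped because `Evaluated conditional (network_state != {}): False`. A minimal sketch of that guard (my own illustration of the `when:` expression, not Ansible source):

```python
# Mirror of the task's `when: network_state != {}` guard seen in the log.
# network_state comes from "role '' defaults" and was empty in this run.
def should_run(network_state):
    return network_state != {}

print(should_run({}))                    # False -> task is skipped
print(should_run({"interfaces": []}))    # True  -> task would run
```

With the role default of `{}`, the result is the `skip_reason: "Conditional result was False"` shown above.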
15980 1727204172.05358: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15980 1727204172.05721: in run() - task 127b8e07-fff9-5f1d-4b72-00000000004c 15980 1727204172.05726: variable 'ansible_search_path' from source: unknown 15980 1727204172.05730: variable 'ansible_search_path' from source: unknown 15980 1727204172.05745: calling self._execute() 15980 1727204172.05923: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204172.05942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204172.05959: variable 'omit' from source: magic vars 15980 1727204172.06485: variable 'ansible_distribution_major_version' from source: facts 15980 1727204172.06490: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204172.06505: variable 'omit' from source: magic vars 15980 1727204172.06592: variable 'omit' from source: magic vars 15980 1727204172.06626: variable 'omit' from source: magic vars 15980 1727204172.06678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204172.06738: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204172.06811: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204172.06814: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204172.06817: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204172.06861: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204172.06875: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204172.06885: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 15980 1727204172.07014: Set connection var ansible_connection to ssh 15980 1727204172.07036: Set connection var ansible_pipelining to False 15980 1727204172.07066: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204172.07073: Set connection var ansible_timeout to 10 15980 1727204172.07135: Set connection var ansible_shell_type to sh 15980 1727204172.07139: Set connection var ansible_shell_executable to /bin/sh 15980 1727204172.07142: variable 'ansible_shell_executable' from source: unknown 15980 1727204172.07144: variable 'ansible_connection' from source: unknown 15980 1727204172.07147: variable 'ansible_module_compression' from source: unknown 15980 1727204172.07149: variable 'ansible_shell_type' from source: unknown 15980 1727204172.07151: variable 'ansible_shell_executable' from source: unknown 15980 1727204172.07158: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204172.07173: variable 'ansible_pipelining' from source: unknown 15980 1727204172.07181: variable 'ansible_timeout' from source: unknown 15980 1727204172.07189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204172.07364: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204172.07461: variable 'omit' from source: magic vars 15980 1727204172.07465: starting attempt loop 15980 1727204172.07470: running the handler 15980 1727204172.07760: variable '__network_connections_result' from source: set_fact 15980 1727204172.08315: handler run complete 15980 1727204172.08319: attempt loop complete, returning result 15980 1727204172.08322: _execute() done 15980 1727204172.08328: dumping result to json 15980 1727204172.08331: 
done dumping result, returning 15980 1727204172.08333: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-5f1d-4b72-00000000004c] 15980 1727204172.08335: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000004c ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 15980 1727204172.08607: no more pending results, returning what we have 15980 1727204172.08619: results queue empty 15980 1727204172.08621: checking for any_errors_fatal 15980 1727204172.08631: done checking for any_errors_fatal 15980 1727204172.08632: checking for max_fail_percentage 15980 1727204172.08634: done checking for max_fail_percentage 15980 1727204172.08635: checking to see if all hosts have failed and the running result is not ok 15980 1727204172.08635: done checking to see if all hosts have failed 15980 1727204172.08637: getting the remaining hosts for this loop 15980 1727204172.08638: done getting the remaining hosts for this loop 15980 1727204172.08643: getting the next task for host managed-node2 15980 1727204172.08651: done getting next task for host managed-node2 15980 1727204172.08656: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15980 1727204172.08658: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204172.08675: getting variables 15980 1727204172.08677: in VariableManager get_vars() 15980 1727204172.09036: Calling all_inventory to load vars for managed-node2 15980 1727204172.09040: Calling groups_inventory to load vars for managed-node2 15980 1727204172.09043: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204172.09053: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000004c 15980 1727204172.09056: WORKER PROCESS EXITING 15980 1727204172.09070: Calling all_plugins_play to load vars for managed-node2 15980 1727204172.09074: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204172.09078: Calling groups_plugins_play to load vars for managed-node2 15980 1727204172.11958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204172.15203: done with get_vars() 15980 1727204172.15245: done getting variables 15980 1727204172.15518: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:56:12 -0400 (0:00:00.118) 0:00:33.565 ***** 15980 1727204172.15555: entering _queue_task() for managed-node2/debug 15980 1727204172.16046: worker is 1 (out of 1 available) 15980 1727204172.16061: exiting _queue_task() for managed-node2/debug 15980 1727204172.16080: done queuing things up, now waiting for results queue to drain 15980 1727204172.16082: waiting for pending results... 
15980 1727204172.16399: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15980 1727204172.16541: in run() - task 127b8e07-fff9-5f1d-4b72-00000000004d 15980 1727204172.16563: variable 'ansible_search_path' from source: unknown 15980 1727204172.16573: variable 'ansible_search_path' from source: unknown 15980 1727204172.16624: calling self._execute() 15980 1727204172.16809: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204172.16813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204172.16816: variable 'omit' from source: magic vars 15980 1727204172.17243: variable 'ansible_distribution_major_version' from source: facts 15980 1727204172.17263: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204172.17280: variable 'omit' from source: magic vars 15980 1727204172.17333: variable 'omit' from source: magic vars 15980 1727204172.17390: variable 'omit' from source: magic vars 15980 1727204172.17441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204172.17491: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204172.17567: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204172.17571: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204172.17575: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204172.17609: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204172.17618: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204172.17628: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 15980 1727204172.17747: Set connection var ansible_connection to ssh 15980 1727204172.17761: Set connection var ansible_pipelining to False 15980 1727204172.17774: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204172.17823: Set connection var ansible_timeout to 10 15980 1727204172.17829: Set connection var ansible_shell_type to sh 15980 1727204172.17832: Set connection var ansible_shell_executable to /bin/sh 15980 1727204172.17846: variable 'ansible_shell_executable' from source: unknown 15980 1727204172.17853: variable 'ansible_connection' from source: unknown 15980 1727204172.17860: variable 'ansible_module_compression' from source: unknown 15980 1727204172.17869: variable 'ansible_shell_type' from source: unknown 15980 1727204172.17876: variable 'ansible_shell_executable' from source: unknown 15980 1727204172.17883: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204172.17934: variable 'ansible_pipelining' from source: unknown 15980 1727204172.17946: variable 'ansible_timeout' from source: unknown 15980 1727204172.17948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204172.18455: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204172.18459: variable 'omit' from source: magic vars 15980 1727204172.18462: starting attempt loop 15980 1727204172.18465: running the handler 15980 1727204172.18470: variable '__network_connections_result' from source: set_fact 15980 1727204172.18499: variable '__network_connections_result' from source: set_fact 15980 1727204172.18764: handler run complete 15980 1727204172.18908: attempt loop complete, returning result 15980 1727204172.18916: 
_execute() done 15980 1727204172.18923: dumping result to json 15980 1727204172.18935: done dumping result, returning 15980 1727204172.18948: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-5f1d-4b72-00000000004d] 15980 1727204172.18983: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000004d ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 15980 1727204172.19212: no more pending results, returning what we have 15980 1727204172.19216: results queue empty 15980 1727204172.19217: checking for any_errors_fatal 15980 1727204172.19226: done checking for any_errors_fatal 15980 1727204172.19227: checking for max_fail_percentage 15980 1727204172.19229: done checking for max_fail_percentage 15980 1727204172.19230: checking to see if all hosts have failed and the running result is not ok 15980 1727204172.19231: done checking to see if all hosts have failed 15980 1727204172.19232: getting the remaining hosts for this loop 15980 1727204172.19234: done getting the remaining hosts for this loop 15980 1727204172.19239: getting the next task for host managed-node2 15980 1727204172.19246: done getting next task for host managed-node2 15980 1727204172.19251: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15980 1727204172.19253: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15980 1727204172.19267: getting variables 15980 1727204172.19269: in VariableManager get_vars() 15980 1727204172.19312: Calling all_inventory to load vars for managed-node2 15980 1727204172.19315: Calling groups_inventory to load vars for managed-node2 15980 1727204172.19317: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204172.19330: Calling all_plugins_play to load vars for managed-node2 15980 1727204172.19333: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204172.19336: Calling groups_plugins_play to load vars for managed-node2 15980 1727204172.19977: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000004d 15980 1727204172.19982: WORKER PROCESS EXITING 15980 1727204172.21650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204172.26852: done with get_vars() 15980 1727204172.26895: done getting variables 15980 1727204172.27064: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:56:12 -0400 (0:00:00.115) 0:00:33.681 ***** 15980 1727204172.27120: entering _queue_task() for managed-node2/debug 15980 1727204172.27843: worker is 1 (out of 1 available) 15980 1727204172.27858: exiting _queue_task() for managed-node2/debug 15980 1727204172.27882: done queuing things up, now waiting for results queue to drain 15980 1727204172.27888: waiting for pending results... 
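The debug task above printed `__network_connections_result.stderr_lines` as `[""]` even though `stderr` was `"\n"`. A sketch of why (assuming, as I believe but have not verified from this log, that Ansible derives the `*_lines` variants by splitting the raw string on line boundaries):

```python
# The module's stderr was a single newline; splitting it on line
# boundaries yields one empty line, which renders as [""] in the output.
stderr = "\n"                      # __network_connections_result.stderr
stderr_lines = stderr.splitlines()
print(stderr_lines)                # ['']
```

So the empty-looking `stderr_lines` is expected here and does not indicate a lost error message.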
15980 1727204172.28431: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15980 1727204172.28615: in run() - task 127b8e07-fff9-5f1d-4b72-00000000004e 15980 1727204172.28644: variable 'ansible_search_path' from source: unknown 15980 1727204172.28652: variable 'ansible_search_path' from source: unknown 15980 1727204172.28771: calling self._execute() 15980 1727204172.28837: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204172.28851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204172.28868: variable 'omit' from source: magic vars 15980 1727204172.29341: variable 'ansible_distribution_major_version' from source: facts 15980 1727204172.29364: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204172.29537: variable 'network_state' from source: role '' defaults 15980 1727204172.29569: Evaluated conditional (network_state != {}): False 15980 1727204172.29622: when evaluation is False, skipping this task 15980 1727204172.29628: _execute() done 15980 1727204172.29631: dumping result to json 15980 1727204172.29634: done dumping result, returning 15980 1727204172.29638: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-5f1d-4b72-00000000004e] 15980 1727204172.29640: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000004e 15980 1727204172.29810: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000004e 15980 1727204172.29814: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 15980 1727204172.29884: no more pending results, returning what we have 15980 1727204172.29889: results queue empty 15980 1727204172.29890: checking for any_errors_fatal 15980 1727204172.29897: done checking for any_errors_fatal 15980 1727204172.29898: checking for 
max_fail_percentage 15980 1727204172.29900: done checking for max_fail_percentage 15980 1727204172.29902: checking to see if all hosts have failed and the running result is not ok 15980 1727204172.29903: done checking to see if all hosts have failed 15980 1727204172.29904: getting the remaining hosts for this loop 15980 1727204172.29905: done getting the remaining hosts for this loop 15980 1727204172.29911: getting the next task for host managed-node2 15980 1727204172.29919: done getting next task for host managed-node2 15980 1727204172.29924: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15980 1727204172.29930: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204172.29948: getting variables 15980 1727204172.29950: in VariableManager get_vars() 15980 1727204172.30001: Calling all_inventory to load vars for managed-node2 15980 1727204172.30005: Calling groups_inventory to load vars for managed-node2 15980 1727204172.30008: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204172.30024: Calling all_plugins_play to load vars for managed-node2 15980 1727204172.30030: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204172.30035: Calling groups_plugins_play to load vars for managed-node2 15980 1727204172.34159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204172.36969: done with get_vars() 15980 1727204172.37010: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:56:12 -0400 
(0:00:00.101) 0:00:33.783 ***** 15980 1727204172.37304: entering _queue_task() for managed-node2/ping 15980 1727204172.38098: worker is 1 (out of 1 available) 15980 1727204172.38110: exiting _queue_task() for managed-node2/ping 15980 1727204172.38122: done queuing things up, now waiting for results queue to drain 15980 1727204172.38127: waiting for pending results... 15980 1727204172.38272: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 15980 1727204172.38368: in run() - task 127b8e07-fff9-5f1d-4b72-00000000004f 15980 1727204172.38381: variable 'ansible_search_path' from source: unknown 15980 1727204172.38387: variable 'ansible_search_path' from source: unknown 15980 1727204172.38429: calling self._execute() 15980 1727204172.38546: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204172.38551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204172.38558: variable 'omit' from source: magic vars 15980 1727204172.38948: variable 'ansible_distribution_major_version' from source: facts 15980 1727204172.38985: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204172.38994: variable 'omit' from source: magic vars 15980 1727204172.39023: variable 'omit' from source: magic vars 15980 1727204172.39060: variable 'omit' from source: magic vars 15980 1727204172.39101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204172.39141: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204172.39157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204172.39178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204172.39203: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204172.39233: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204172.39237: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204172.39240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204172.39329: Set connection var ansible_connection to ssh 15980 1727204172.39334: Set connection var ansible_pipelining to False 15980 1727204172.39337: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204172.39340: Set connection var ansible_timeout to 10 15980 1727204172.39367: Set connection var ansible_shell_type to sh 15980 1727204172.39371: Set connection var ansible_shell_executable to /bin/sh 15980 1727204172.39395: variable 'ansible_shell_executable' from source: unknown 15980 1727204172.39399: variable 'ansible_connection' from source: unknown 15980 1727204172.39402: variable 'ansible_module_compression' from source: unknown 15980 1727204172.39407: variable 'ansible_shell_type' from source: unknown 15980 1727204172.39410: variable 'ansible_shell_executable' from source: unknown 15980 1727204172.39414: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204172.39416: variable 'ansible_pipelining' from source: unknown 15980 1727204172.39419: variable 'ansible_timeout' from source: unknown 15980 1727204172.39421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204172.39706: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204172.39711: variable 'omit' from source: magic vars 15980 1727204172.39714: starting attempt loop 15980 1727204172.39716: running 
the handler 15980 1727204172.39719: _low_level_execute_command(): starting 15980 1727204172.39721: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204172.40863: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204172.40870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204172.42625: stdout chunk (state=3): >>>/root <<< 15980 1727204172.42791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204172.42796: stdout chunk (state=3): >>><<< 15980 1727204172.42804: stderr chunk (state=3): >>><<< 15980 1727204172.42826: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204172.42841: _low_level_execute_command(): starting 15980 1727204172.42848: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204172.4282904-18588-36359301954526 `" && echo ansible-tmp-1727204172.4282904-18588-36359301954526="` echo /root/.ansible/tmp/ansible-tmp-1727204172.4282904-18588-36359301954526 `" ) && sleep 0' 15980 1727204172.43419: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204172.43425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204172.43428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204172.43457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204172.43504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204172.43583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204172.45586: stdout chunk (state=3): >>>ansible-tmp-1727204172.4282904-18588-36359301954526=/root/.ansible/tmp/ansible-tmp-1727204172.4282904-18588-36359301954526 <<< 15980 1727204172.45705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204172.45823: stderr chunk (state=3): >>><<< 15980 1727204172.45829: stdout chunk (state=3): >>><<< 15980 1727204172.45971: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204172.4282904-18588-36359301954526=/root/.ansible/tmp/ansible-tmp-1727204172.4282904-18588-36359301954526 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204172.45975: variable 'ansible_module_compression' from source: unknown 15980 1727204172.45977: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15980 1727204172.46014: variable 'ansible_facts' from source: unknown 15980 1727204172.46112: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204172.4282904-18588-36359301954526/AnsiballZ_ping.py 15980 1727204172.46344: Sending initial data 15980 1727204172.46347: Sent initial data (152 bytes) 15980 1727204172.47046: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204172.47092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 15980 1727204172.47201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204172.47218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204172.47239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204172.47262: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204172.47370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204172.49005: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204172.49103: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204172.49208: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmptsuouz_b /root/.ansible/tmp/ansible-tmp-1727204172.4282904-18588-36359301954526/AnsiballZ_ping.py <<< 15980 1727204172.49212: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204172.4282904-18588-36359301954526/AnsiballZ_ping.py" <<< 15980 1727204172.49311: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmptsuouz_b" to remote "/root/.ansible/tmp/ansible-tmp-1727204172.4282904-18588-36359301954526/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204172.4282904-18588-36359301954526/AnsiballZ_ping.py" <<< 15980 1727204172.50173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204172.50342: stderr chunk (state=3): >>><<< 15980 1727204172.50346: stdout chunk (state=3): >>><<< 15980 1727204172.50348: done transferring module to remote 15980 1727204172.50357: _low_level_execute_command(): starting 15980 1727204172.50369: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204172.4282904-18588-36359301954526/ /root/.ansible/tmp/ansible-tmp-1727204172.4282904-18588-36359301954526/AnsiballZ_ping.py && sleep 0' 15980 1727204172.51072: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204172.51091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204172.51116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204172.51137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204172.51227: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204172.51264: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204172.51282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204172.51306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204172.51429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204172.53354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204172.53358: stdout chunk (state=3): >>><<< 15980 1727204172.53360: stderr chunk (state=3): >>><<< 15980 1727204172.53474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204172.53479: _low_level_execute_command(): starting 15980 1727204172.53482: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204172.4282904-18588-36359301954526/AnsiballZ_ping.py && sleep 0' 15980 1727204172.54116: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204172.54144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204172.54160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204172.54183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204172.54250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204172.54303: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204172.54323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204172.54356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204172.54482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204172.70955: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15980 1727204172.72262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204172.72323: stderr chunk (state=3): >>><<< 15980 1727204172.72331: stdout chunk (state=3): >>><<< 15980 1727204172.72344: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 15980 1727204172.72368: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204172.4282904-18588-36359301954526/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204172.72377: _low_level_execute_command(): starting 15980 1727204172.72383: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204172.4282904-18588-36359301954526/ > /dev/null 2>&1 && sleep 0' 15980 1727204172.72861: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204172.72868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 15980 1727204172.72898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204172.72901: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 15980 1727204172.72903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 
1727204172.72906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204172.72957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204172.72961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204172.72977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204172.73051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204172.74976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204172.75033: stderr chunk (state=3): >>><<< 15980 1727204172.75037: stdout chunk (state=3): >>><<< 15980 1727204172.75050: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204172.75059: handler run complete 15980 1727204172.75072: attempt loop complete, returning result 15980 1727204172.75075: _execute() done 15980 1727204172.75078: dumping result to json 15980 1727204172.75088: done dumping result, returning 15980 1727204172.75096: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-5f1d-4b72-00000000004f] 15980 1727204172.75100: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000004f 15980 1727204172.75199: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000004f 15980 1727204172.75202: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 15980 1727204172.75268: no more pending results, returning what we have 15980 1727204172.75271: results queue empty 15980 1727204172.75272: checking for any_errors_fatal 15980 1727204172.75279: done checking for any_errors_fatal 15980 1727204172.75280: checking for max_fail_percentage 15980 1727204172.75281: done checking for max_fail_percentage 15980 1727204172.75282: checking to see if all hosts have failed and the running result is not ok 15980 1727204172.75283: done checking to see if all hosts have failed 15980 1727204172.75284: getting the remaining hosts for this loop 15980 1727204172.75286: done getting the remaining hosts for this loop 15980 1727204172.75290: getting the next task for host managed-node2 15980 1727204172.75298: done getting next task for host managed-node2 15980 1727204172.75300: ^ task is: TASK: meta (role_complete) 15980 1727204172.75302: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15980 1727204172.75315: getting variables
15980 1727204172.75316: in VariableManager get_vars()
15980 1727204172.75357: Calling all_inventory to load vars for managed-node2
15980 1727204172.75360: Calling groups_inventory to load vars for managed-node2
15980 1727204172.75362: Calling all_plugins_inventory to load vars for managed-node2
15980 1727204172.75380: Calling all_plugins_play to load vars for managed-node2
15980 1727204172.75384: Calling groups_plugins_inventory to load vars for managed-node2
15980 1727204172.75387: Calling groups_plugins_play to load vars for managed-node2
15980 1727204172.76536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15980 1727204172.77698: done with get_vars()
15980 1727204172.77723: done getting variables
15980 1727204172.77795: done queuing things up, now waiting for results queue to drain
15980 1727204172.77797: results queue empty
15980 1727204172.77798: checking for any_errors_fatal
15980 1727204172.77800: done checking for any_errors_fatal
15980 1727204172.77801: checking for max_fail_percentage
15980 1727204172.77802: done checking for max_fail_percentage
15980 1727204172.77802: checking to see if all hosts have failed and the running result is not ok
15980 1727204172.77803: done checking to see if all hosts have failed
15980 1727204172.77803: getting the remaining hosts for this loop
15980 1727204172.77804: done getting the remaining hosts for this loop
15980 1727204172.77806: getting the next task for host managed-node2
15980 1727204172.77808: done getting next task for host managed-node2
15980 1727204172.77809: ^ task is: TASK: meta (flush_handlers)
15980 1727204172.77810: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15980 1727204172.77813: getting variables
15980 1727204172.77813: in VariableManager get_vars()
15980 1727204172.77823: Calling all_inventory to load vars for managed-node2
15980 1727204172.77825: Calling groups_inventory to load vars for managed-node2
15980 1727204172.77826: Calling all_plugins_inventory to load vars for managed-node2
15980 1727204172.77830: Calling all_plugins_play to load vars for managed-node2
15980 1727204172.77832: Calling groups_plugins_inventory to load vars for managed-node2
15980 1727204172.77834: Calling groups_plugins_play to load vars for managed-node2
15980 1727204172.78675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15980 1727204172.79968: done with get_vars()
15980 1727204172.79990: done getting variables
15980 1727204172.80034: in VariableManager get_vars()
15980 1727204172.80046: Calling all_inventory to load vars for managed-node2
15980 1727204172.80048: Calling groups_inventory to load vars for managed-node2
15980 1727204172.80050: Calling all_plugins_inventory to load vars for managed-node2
15980 1727204172.80054: Calling all_plugins_play to load vars for managed-node2
15980 1727204172.80056: Calling groups_plugins_inventory to load vars for managed-node2
15980 1727204172.80058: Calling groups_plugins_play to load vars for managed-node2
15980 1727204172.80879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15980 1727204172.82044: done with get_vars()
15980 1727204172.82079: done queuing things up, now waiting for results queue to drain
15980 1727204172.82081: results queue empty
15980 1727204172.82082: checking for any_errors_fatal
15980 1727204172.82083: done checking for any_errors_fatal
15980 1727204172.82083: checking for max_fail_percentage
15980 1727204172.82084: done checking for max_fail_percentage
15980 1727204172.82084: checking to see if all hosts have failed and the running result is not ok
15980 1727204172.82085: done checking to see if all hosts have failed
15980 1727204172.82085: getting the remaining hosts for this loop
15980 1727204172.82086: done getting the remaining hosts for this loop
15980 1727204172.82088: getting the next task for host managed-node2
15980 1727204172.82091: done getting next task for host managed-node2
15980 1727204172.82093: ^ task is: TASK: meta (flush_handlers)
15980 1727204172.82094: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15980 1727204172.82097: getting variables
15980 1727204172.82098: in VariableManager get_vars()
15980 1727204172.82108: Calling all_inventory to load vars for managed-node2
15980 1727204172.82109: Calling groups_inventory to load vars for managed-node2
15980 1727204172.82111: Calling all_plugins_inventory to load vars for managed-node2
15980 1727204172.82116: Calling all_plugins_play to load vars for managed-node2
15980 1727204172.82118: Calling groups_plugins_inventory to load vars for managed-node2
15980 1727204172.82120: Calling groups_plugins_play to load vars for managed-node2
15980 1727204172.83007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15980 1727204172.84157: done with get_vars()
15980 1727204172.84185: done getting variables
15980 1727204172.84227: in VariableManager get_vars()
15980 1727204172.84237: Calling all_inventory to load vars for managed-node2
15980 1727204172.84239: Calling groups_inventory to load vars for managed-node2
15980 1727204172.84240: Calling all_plugins_inventory to load vars for managed-node2
15980 1727204172.84244: Calling all_plugins_play to load vars for managed-node2
15980 1727204172.84246: Calling groups_plugins_inventory to load vars for managed-node2
15980 1727204172.84248: Calling groups_plugins_play to load vars for managed-node2
15980 1727204172.85083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15980 1727204172.86341: done with get_vars()
15980 1727204172.86367: done queuing things up, now waiting for results queue to drain
15980 1727204172.86369: results queue empty
15980 1727204172.86370: checking for any_errors_fatal
15980 1727204172.86371: done checking for any_errors_fatal
15980 1727204172.86371: checking for max_fail_percentage
15980 1727204172.86372: done checking for max_fail_percentage
15980 1727204172.86372: checking to see if all hosts have failed and the running result is not ok
15980 1727204172.86373: done checking to see if all hosts have failed
15980 1727204172.86374: getting the remaining hosts for this loop
15980 1727204172.86374: done getting the remaining hosts for this loop
15980 1727204172.86376: getting the next task for host managed-node2
15980 1727204172.86379: done getting next task for host managed-node2
15980 1727204172.86379: ^ task is: None
15980 1727204172.86381: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15980 1727204172.86381: done queuing things up, now waiting for results queue to drain
15980 1727204172.86382: results queue empty
15980 1727204172.86382: checking for any_errors_fatal
15980 1727204172.86383: done checking for any_errors_fatal
15980 1727204172.86383: checking for max_fail_percentage
15980 1727204172.86384: done checking for max_fail_percentage
15980 1727204172.86384: checking to see if all hosts have failed and the running result is not ok
15980 1727204172.86385: done checking to see if all hosts have failed
15980 1727204172.86386: getting the next task for host managed-node2
15980 1727204172.86387: done getting next task for host managed-node2
15980 1727204172.86388: ^ task is: None
15980 1727204172.86389: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15980 1727204172.86429: in VariableManager get_vars()
15980 1727204172.86444: done with get_vars()
15980 1727204172.86448: in VariableManager get_vars()
15980 1727204172.86454: done with get_vars()
15980 1727204172.86457: variable 'omit' from source: magic vars
15980 1727204172.86482: in VariableManager get_vars()
15980 1727204172.86489: done with get_vars()
15980 1727204172.86504: variable 'omit' from source: magic vars

PLAY [Delete the interface] ****************************************************
15980 1727204172.86638: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False)
15980 1727204172.86661: getting the remaining hosts for this loop
15980 1727204172.86663: done getting the remaining hosts for this loop
15980 1727204172.86665: getting the next task for host managed-node2
15980 1727204172.86669: done getting next task for host managed-node2
15980 1727204172.86670: ^ task is: TASK: Gathering Facts
15980 1727204172.86672: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15980 1727204172.86673: getting variables
15980 1727204172.86674: in VariableManager get_vars()
15980 1727204172.86680: Calling all_inventory to load vars for managed-node2
15980 1727204172.86681: Calling groups_inventory to load vars for managed-node2
15980 1727204172.86683: Calling all_plugins_inventory to load vars for managed-node2
15980 1727204172.86688: Calling all_plugins_play to load vars for managed-node2
15980 1727204172.86690: Calling groups_plugins_inventory to load vars for managed-node2
15980 1727204172.86692: Calling groups_plugins_play to load vars for managed-node2
15980 1727204172.87525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15980 1727204172.88661: done with get_vars()
15980 1727204172.88688: done getting variables
15980 1727204172.88729: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
Tuesday 24 September 2024 14:56:12 -0400 (0:00:00.514) 0:00:34.297 *****
15980 1727204172.88750: entering _queue_task() for managed-node2/gather_facts
15980 1727204172.89028: worker is 1 (out of 1 available)
15980 1727204172.89042: exiting _queue_task() for managed-node2/gather_facts
15980 1727204172.89053: done queuing things up, now waiting for results queue to drain
15980 1727204172.89055: waiting for pending results...
15980 1727204172.89261: running TaskExecutor() for managed-node2/TASK: Gathering Facts
15980 1727204172.89347: in run() - task 127b8e07-fff9-5f1d-4b72-000000000382
15980 1727204172.89360: variable 'ansible_search_path' from source: unknown
15980 1727204172.89396: calling self._execute()
15980 1727204172.89476: variable 'ansible_host' from source: host vars for 'managed-node2'
15980 1727204172.89482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
15980 1727204172.89490: variable 'omit' from source: magic vars
15980 1727204172.89811: variable 'ansible_distribution_major_version' from source: facts
15980 1727204172.89823: Evaluated conditional (ansible_distribution_major_version != '6'): True
15980 1727204172.89833: variable 'omit' from source: magic vars
15980 1727204172.89857: variable 'omit' from source: magic vars
15980 1727204172.89886: variable 'omit' from source: magic vars
15980 1727204172.89923: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15980 1727204172.89959: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15980 1727204172.89979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15980 1727204172.89995: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15980 1727204172.90005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15980 1727204172.90033: variable 'inventory_hostname' from source: host vars for 'managed-node2'
15980 1727204172.90037: variable 'ansible_host' from source: host vars for 'managed-node2'
15980 1727204172.90039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
15980 1727204172.90115: Set connection var ansible_connection to ssh
15980 1727204172.90122: Set connection var ansible_pipelining to False
15980 1727204172.90130: Set connection var ansible_module_compression to ZIP_DEFLATED
15980 1727204172.90136: Set connection var ansible_timeout to 10
15980 1727204172.90142: Set connection var ansible_shell_type to sh
15980 1727204172.90147: Set connection var ansible_shell_executable to /bin/sh
15980 1727204172.90175: variable 'ansible_shell_executable' from source: unknown
15980 1727204172.90179: variable 'ansible_connection' from source: unknown
15980 1727204172.90181: variable 'ansible_module_compression' from source: unknown
15980 1727204172.90184: variable 'ansible_shell_type' from source: unknown
15980 1727204172.90186: variable 'ansible_shell_executable' from source: unknown
15980 1727204172.90191: variable 'ansible_host' from source: host vars for 'managed-node2'
15980 1727204172.90195: variable 'ansible_pipelining' from source: unknown
15980 1727204172.90198: variable 'ansible_timeout' from source: unknown
15980 1727204172.90202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
15980 1727204172.90355: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
15980 1727204172.90364: variable 'omit' from source: magic vars
15980 1727204172.90370: starting attempt loop
15980 1727204172.90373: running the handler
15980 1727204172.90391: variable 'ansible_facts' from source: unknown
15980 1727204172.90408: _low_level_execute_command(): starting
15980 1727204172.90414: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
15980 1727204172.90973: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
15980 1727204172.90979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<<
15980 1727204172.90994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15980 1727204172.91070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<<
15980 1727204172.91085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15980 1727204172.91205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15980 1727204172.92936: stdout chunk (state=3): >>>/root <<<
15980 1727204172.93043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15980 1727204172.93104: stderr chunk (state=3): >>><<<
15980 1727204172.93108: stdout chunk (state=3): >>><<<
15980 1727204172.93136: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15980 1727204172.93146: _low_level_execute_command(): starting
15980 1727204172.93155: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204172.9313238-18604-211200851775398 `" && echo ansible-tmp-1727204172.9313238-18604-211200851775398="` echo /root/.ansible/tmp/ansible-tmp-1727204172.9313238-18604-211200851775398 `" ) && sleep 0'
15980 1727204172.93662: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
15980 1727204172.93669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<<
15980 1727204172.93673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<<
15980 1727204172.93683: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15980 1727204172.93688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15980 1727204172.93733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<<
15980 1727204172.93741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15980 1727204172.93747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15980 1727204172.93819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15980 1727204172.95794: stdout chunk (state=3): >>>ansible-tmp-1727204172.9313238-18604-211200851775398=/root/.ansible/tmp/ansible-tmp-1727204172.9313238-18604-211200851775398 <<<
15980 1727204172.95915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15980 1727204172.95978: stderr chunk (state=3): >>><<<
15980 1727204172.95981: stdout chunk (state=3): >>><<<
15980 1727204172.96005: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204172.9313238-18604-211200851775398=/root/.ansible/tmp/ansible-tmp-1727204172.9313238-18604-211200851775398 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15980 1727204172.96034: variable 'ansible_module_compression' from source: unknown
15980 1727204172.96078: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
15980 1727204172.96140: variable 'ansible_facts' from source: unknown
15980 1727204172.96274: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204172.9313238-18604-211200851775398/AnsiballZ_setup.py
15980 1727204172.96397: Sending initial data
15980 1727204172.96400: Sent initial data (154 bytes)
15980 1727204172.96905: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15980 1727204172.96911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
15980 1727204172.96914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15980 1727204172.96971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<<
15980 1727204172.96974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15980 1727204172.96983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15980 1727204172.97060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15980 1727204172.98695: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
15980 1727204172.98779: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
15980 1727204172.98859: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmphk2nx7lp /root/.ansible/tmp/ansible-tmp-1727204172.9313238-18604-211200851775398/AnsiballZ_setup.py <<<
15980 1727204172.98863: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204172.9313238-18604-211200851775398/AnsiballZ_setup.py" <<<
15980 1727204172.98933: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmphk2nx7lp" to remote "/root/.ansible/tmp/ansible-tmp-1727204172.9313238-18604-211200851775398/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204172.9313238-18604-211200851775398/AnsiballZ_setup.py" <<<
15980 1727204173.09015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15980 1727204173.09096: stderr chunk (state=3): >>><<<
15980 1727204173.09100: stdout chunk (state=3): >>><<<
15980 1727204173.09122: done transferring module to remote
15980 1727204173.09135: _low_level_execute_command(): starting
15980 1727204173.09140: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204172.9313238-18604-211200851775398/ /root/.ansible/tmp/ansible-tmp-1727204172.9313238-18604-211200851775398/AnsiballZ_setup.py && sleep 0'
15980 1727204173.09650: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
15980 1727204173.09653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15980 1727204173.09657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<<
15980 1727204173.09659: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15980 1727204173.09661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15980 1727204173.09719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<<
15980 1727204173.09723: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15980 1727204173.09727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15980 1727204173.09803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15980 1727204173.11773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15980 1727204173.11918: stderr chunk (state=3): >>><<<
15980 1727204173.11922: stdout chunk (state=3): >>><<<
15980 1727204173.12012: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15980 1727204173.12017: _low_level_execute_command(): starting
15980 1727204173.12020: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204172.9313238-18604-211200851775398/AnsiballZ_setup.py && sleep 0'
15980 1727204173.12930: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<<
15980 1727204173.12986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<<
15980 1727204173.13001: stderr chunk (state=3): >>>debug2: match found <<<
15980 1727204173.13076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15980 1727204173.13101: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<<
15980 1727204173.13118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15980 1727204173.13385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15980 1727204173.79220: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.7314453125, "5m": 0.52685546875, "15m": 0.26171875}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3043, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 673, "free": 3043}, "nocache": {"free": 3473, "used": 243}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_vers<<<
15980 1727204173.79238: stdout chunk (state=3): >>>ion": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 519, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325620224, "block_size": 4096, "block_total": 64479564, "block_available": 61358794, "block_used": 3120770, "inode_total": 16384000, "inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "13", "epoch": "1727204173", "epoch_int": "1727204173", "date": "2024-09-24", "time": "14:56:13", "iso8601_micro": "2024-09-24T18:56:13.760828Z", "iso8601": "2024-09-24T18:56:13Z", "iso8601_basic": "20240924T145613760828", "iso8601_basic_short": "20240924T145613", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_pkg_mgr": "dnf", "ansible_service_mgr"
"systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15980 1727204173.81309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204173.81375: stderr chunk (state=3): >>><<< 15980 1727204173.81378: stdout chunk (state=3): >>><<< 15980 1727204173.81406: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.7314453125, "5m": 0.52685546875, "15m": 0.26171875}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, 
"ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3043, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 673, "free": 3043}, "nocache": {"free": 3473, "used": 243}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": 
null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 519, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325620224, "block_size": 4096, "block_total": 64479564, "block_available": 61358794, "block_used": 3120770, "inode_total": 16384000, "inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", 
"minute": "56", "second": "13", "epoch": "1727204173", "epoch_int": "1727204173", "date": "2024-09-24", "time": "14:56:13", "iso8601_micro": "2024-09-24T18:56:13.760828Z", "iso8601": "2024-09-24T18:56:13Z", "iso8601_basic": "20240924T145613760828", "iso8601_basic_short": "20240924T145613", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
15980 1727204173.81613: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204172.9313238-18604-211200851775398/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204173.81633: _low_level_execute_command(): starting 15980 1727204173.81637: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204172.9313238-18604-211200851775398/ > /dev/null 2>&1 && sleep 0' 15980 1727204173.82143: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204173.82147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204173.82149: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204173.82152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204173.82209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204173.82217: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204173.82219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204173.82288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204173.84203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204173.84267: stderr chunk (state=3): >>><<< 15980 1727204173.84272: stdout chunk (state=3): >>><<< 15980 1727204173.84288: _low_level_execute_command() done: rc=0, stdout=, stderr= 15980 1727204173.84296: handler run complete 15980 1727204173.84382: variable 'ansible_facts' from source: unknown 15980 1727204173.84468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204173.84793: variable 'ansible_facts' from source: unknown 15980 1727204173.84858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204173.84939: attempt loop complete, returning result 15980 1727204173.84943: _execute() done 15980 1727204173.84946: dumping result to json 15980 1727204173.84962: done dumping result, returning 15980 1727204173.84972: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-5f1d-4b72-000000000382] 15980 1727204173.84976: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000382 15980 1727204173.85208: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000382 15980 1727204173.85212: WORKER PROCESS EXITING ok: [managed-node2] 15980 1727204173.85451: no more pending results, returning what we have 15980 1727204173.85454: results queue empty 15980 1727204173.85455: checking for any_errors_fatal 15980 1727204173.85456: done checking for any_errors_fatal 15980 1727204173.85456: checking for max_fail_percentage 15980 1727204173.85457: done checking for max_fail_percentage 15980 1727204173.85458: checking to see if all hosts have failed and the running result is not ok 15980 1727204173.85458: done checking to see if all hosts have failed 15980 1727204173.85459: getting the remaining hosts for this loop 15980 1727204173.85460: done getting the remaining hosts for this loop 15980 1727204173.85463: getting the next task for host managed-node2 15980 1727204173.85469: done getting next task for host managed-node2 15980 1727204173.85470: ^ task is: TASK: meta (flush_handlers) 15980 1727204173.85472: ^ state is: HOST STATE: block=1, task=1, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204173.85475: getting variables 15980 1727204173.85476: in VariableManager get_vars() 15980 1727204173.85499: Calling all_inventory to load vars for managed-node2 15980 1727204173.85501: Calling groups_inventory to load vars for managed-node2 15980 1727204173.85504: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204173.85513: Calling all_plugins_play to load vars for managed-node2 15980 1727204173.85515: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204173.85517: Calling groups_plugins_play to load vars for managed-node2 15980 1727204173.86539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204173.87724: done with get_vars() 15980 1727204173.87756: done getting variables 15980 1727204173.87818: in VariableManager get_vars() 15980 1727204173.87830: Calling all_inventory to load vars for managed-node2 15980 1727204173.87832: Calling groups_inventory to load vars for managed-node2 15980 1727204173.87834: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204173.87838: Calling all_plugins_play to load vars for managed-node2 15980 1727204173.87839: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204173.87841: Calling groups_plugins_play to load vars for managed-node2 15980 1727204173.88760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204173.89933: done with get_vars() 15980 1727204173.89970: done queuing things up, now waiting for results queue to drain 15980 1727204173.89972: results queue empty 15980 1727204173.89973: checking for any_errors_fatal 15980 
1727204173.89976: done checking for any_errors_fatal 15980 1727204173.89977: checking for max_fail_percentage 15980 1727204173.89977: done checking for max_fail_percentage 15980 1727204173.89983: checking to see if all hosts have failed and the running result is not ok 15980 1727204173.89983: done checking to see if all hosts have failed 15980 1727204173.89984: getting the remaining hosts for this loop 15980 1727204173.89984: done getting the remaining hosts for this loop 15980 1727204173.89986: getting the next task for host managed-node2 15980 1727204173.89989: done getting next task for host managed-node2 15980 1727204173.89991: ^ task is: TASK: Include the task 'delete_interface.yml' 15980 1727204173.89993: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204173.89995: getting variables 15980 1727204173.89995: in VariableManager get_vars() 15980 1727204173.90003: Calling all_inventory to load vars for managed-node2 15980 1727204173.90004: Calling groups_inventory to load vars for managed-node2 15980 1727204173.90006: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204173.90012: Calling all_plugins_play to load vars for managed-node2 15980 1727204173.90014: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204173.90016: Calling groups_plugins_play to load vars for managed-node2 15980 1727204173.90869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204173.92093: done with get_vars() 15980 1727204173.92115: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Tuesday 24 September 2024 14:56:13 -0400 (0:00:01.034) 0:00:35.332 ***** 15980 1727204173.92182: entering _queue_task() for managed-node2/include_tasks 15980 1727204173.92478: worker is 1 (out of 1 available) 15980 1727204173.92494: exiting _queue_task() for managed-node2/include_tasks 15980 1727204173.92506: done queuing things up, now waiting for results queue to drain 15980 1727204173.92508: waiting for pending results... 
15980 1727204173.92699: running TaskExecutor() for managed-node2/TASK: Include the task 'delete_interface.yml' 15980 1727204173.92790: in run() - task 127b8e07-fff9-5f1d-4b72-000000000052 15980 1727204173.92803: variable 'ansible_search_path' from source: unknown 15980 1727204173.92838: calling self._execute() 15980 1727204173.92915: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204173.92921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204173.92959: variable 'omit' from source: magic vars 15980 1727204173.93240: variable 'ansible_distribution_major_version' from source: facts 15980 1727204173.93252: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204173.93257: _execute() done 15980 1727204173.93261: dumping result to json 15980 1727204173.93264: done dumping result, returning 15980 1727204173.93272: done running TaskExecutor() for managed-node2/TASK: Include the task 'delete_interface.yml' [127b8e07-fff9-5f1d-4b72-000000000052] 15980 1727204173.93279: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000052 15980 1727204173.93389: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000052 15980 1727204173.93392: WORKER PROCESS EXITING 15980 1727204173.93423: no more pending results, returning what we have 15980 1727204173.93431: in VariableManager get_vars() 15980 1727204173.93472: Calling all_inventory to load vars for managed-node2 15980 1727204173.93475: Calling groups_inventory to load vars for managed-node2 15980 1727204173.93479: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204173.93494: Calling all_plugins_play to load vars for managed-node2 15980 1727204173.93497: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204173.93500: Calling groups_plugins_play to load vars for managed-node2 15980 1727204173.98591: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204173.99745: done with get_vars() 15980 1727204173.99771: variable 'ansible_search_path' from source: unknown 15980 1727204173.99786: we have included files to process 15980 1727204173.99787: generating all_blocks data 15980 1727204173.99787: done generating all_blocks data 15980 1727204173.99788: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15980 1727204173.99789: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15980 1727204173.99790: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15980 1727204173.99950: done processing included file 15980 1727204173.99951: iterating over new_blocks loaded from include file 15980 1727204173.99952: in VariableManager get_vars() 15980 1727204173.99962: done with get_vars() 15980 1727204173.99963: filtering new block on tags 15980 1727204173.99975: done filtering new block on tags 15980 1727204173.99977: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node2 15980 1727204173.99981: extending task lists for all hosts with included blocks 15980 1727204174.00003: done extending task lists 15980 1727204174.00004: done processing included files 15980 1727204174.00005: results queue empty 15980 1727204174.00005: checking for any_errors_fatal 15980 1727204174.00006: done checking for any_errors_fatal 15980 1727204174.00007: checking for max_fail_percentage 15980 1727204174.00008: done checking for max_fail_percentage 15980 1727204174.00009: checking to see if all hosts have failed and the running result 
is not ok 15980 1727204174.00009: done checking to see if all hosts have failed 15980 1727204174.00010: getting the remaining hosts for this loop 15980 1727204174.00011: done getting the remaining hosts for this loop 15980 1727204174.00012: getting the next task for host managed-node2 15980 1727204174.00015: done getting next task for host managed-node2 15980 1727204174.00016: ^ task is: TASK: Remove test interface if necessary 15980 1727204174.00017: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204174.00019: getting variables 15980 1727204174.00020: in VariableManager get_vars() 15980 1727204174.00027: Calling all_inventory to load vars for managed-node2 15980 1727204174.00029: Calling groups_inventory to load vars for managed-node2 15980 1727204174.00031: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204174.00035: Calling all_plugins_play to load vars for managed-node2 15980 1727204174.00037: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204174.00039: Calling groups_plugins_play to load vars for managed-node2 15980 1727204174.00935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204174.02428: done with get_vars() 15980 1727204174.02458: done getting variables 15980 1727204174.02497: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 14:56:14 -0400 (0:00:00.103) 0:00:35.435 ***** 15980 1727204174.02518: entering _queue_task() for managed-node2/command 15980 1727204174.02821: worker is 1 (out of 1 available) 15980 1727204174.02836: exiting _queue_task() for managed-node2/command 15980 1727204174.02850: done queuing things up, now waiting for results queue to drain 15980 1727204174.02852: waiting for pending results... 
15980 1727204174.03042: running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary 15980 1727204174.03130: in run() - task 127b8e07-fff9-5f1d-4b72-000000000393 15980 1727204174.03141: variable 'ansible_search_path' from source: unknown 15980 1727204174.03145: variable 'ansible_search_path' from source: unknown 15980 1727204174.03177: calling self._execute() 15980 1727204174.03264: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204174.03274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204174.03282: variable 'omit' from source: magic vars 15980 1727204174.03609: variable 'ansible_distribution_major_version' from source: facts 15980 1727204174.03621: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204174.03631: variable 'omit' from source: magic vars 15980 1727204174.03664: variable 'omit' from source: magic vars 15980 1727204174.03736: variable 'interface' from source: set_fact 15980 1727204174.03759: variable 'omit' from source: magic vars 15980 1727204174.03794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204174.03825: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204174.03848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204174.03867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204174.03880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204174.03905: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204174.03908: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204174.03913: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204174.03992: Set connection var ansible_connection to ssh 15980 1727204174.03999: Set connection var ansible_pipelining to False 15980 1727204174.04006: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204174.04013: Set connection var ansible_timeout to 10 15980 1727204174.04018: Set connection var ansible_shell_type to sh 15980 1727204174.04023: Set connection var ansible_shell_executable to /bin/sh 15980 1727204174.04049: variable 'ansible_shell_executable' from source: unknown 15980 1727204174.04053: variable 'ansible_connection' from source: unknown 15980 1727204174.04057: variable 'ansible_module_compression' from source: unknown 15980 1727204174.04060: variable 'ansible_shell_type' from source: unknown 15980 1727204174.04062: variable 'ansible_shell_executable' from source: unknown 15980 1727204174.04065: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204174.04069: variable 'ansible_pipelining' from source: unknown 15980 1727204174.04072: variable 'ansible_timeout' from source: unknown 15980 1727204174.04074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204174.04196: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204174.04200: variable 'omit' from source: magic vars 15980 1727204174.04205: starting attempt loop 15980 1727204174.04208: running the handler 15980 1727204174.04221: _low_level_execute_command(): starting 15980 1727204174.04230: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204174.04803: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204174.04808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15980 1727204174.04812: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 15980 1727204174.04815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204174.04868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204174.04877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204174.04949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204174.06680: stdout chunk (state=3): >>>/root <<< 15980 1727204174.06781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204174.06851: stderr chunk (state=3): >>><<< 15980 1727204174.06855: stdout chunk (state=3): >>><<< 15980 1727204174.06882: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204174.06893: _low_level_execute_command(): starting 15980 1727204174.06899: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204174.0687952-18635-77704591394868 `" && echo ansible-tmp-1727204174.0687952-18635-77704591394868="` echo /root/.ansible/tmp/ansible-tmp-1727204174.0687952-18635-77704591394868 `" ) && sleep 0' 15980 1727204174.07407: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204174.07411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204174.07421: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 15980 1727204174.07425: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204174.07481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204174.07485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204174.07564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204174.09543: stdout chunk (state=3): >>>ansible-tmp-1727204174.0687952-18635-77704591394868=/root/.ansible/tmp/ansible-tmp-1727204174.0687952-18635-77704591394868 <<< 15980 1727204174.09653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204174.09717: stderr chunk (state=3): >>><<< 15980 1727204174.09720: stdout chunk (state=3): >>><<< 15980 1727204174.09743: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204174.0687952-18635-77704591394868=/root/.ansible/tmp/ansible-tmp-1727204174.0687952-18635-77704591394868 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204174.09774: variable 'ansible_module_compression' from source: unknown 15980 1727204174.09821: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15980 1727204174.09861: variable 'ansible_facts' from source: unknown 15980 1727204174.09914: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204174.0687952-18635-77704591394868/AnsiballZ_command.py 15980 1727204174.10032: Sending initial data 15980 1727204174.10036: Sent initial data (155 bytes) 15980 1727204174.10569: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204174.10573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204174.10576: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204174.10622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204174.10626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204174.10639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204174.10718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204174.12312: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204174.12381: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204174.12453: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpfkkl_bv8 /root/.ansible/tmp/ansible-tmp-1727204174.0687952-18635-77704591394868/AnsiballZ_command.py <<< 15980 1727204174.12457: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204174.0687952-18635-77704591394868/AnsiballZ_command.py" <<< 15980 1727204174.12523: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpfkkl_bv8" to remote "/root/.ansible/tmp/ansible-tmp-1727204174.0687952-18635-77704591394868/AnsiballZ_command.py" <<< 15980 1727204174.12526: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204174.0687952-18635-77704591394868/AnsiballZ_command.py" <<< 15980 1727204174.13178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204174.13259: stderr chunk (state=3): >>><<< 15980 1727204174.13263: stdout chunk (state=3): >>><<< 15980 1727204174.13284: done transferring module to remote 15980 1727204174.13298: _low_level_execute_command(): starting 15980 1727204174.13301: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204174.0687952-18635-77704591394868/ /root/.ansible/tmp/ansible-tmp-1727204174.0687952-18635-77704591394868/AnsiballZ_command.py && sleep 0' 15980 1727204174.13804: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204174.13807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 
1727204174.13810: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204174.13812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 15980 1727204174.13819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204174.13872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204174.13880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204174.13882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204174.13951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204174.15852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204174.15974: stderr chunk (state=3): >>><<< 15980 1727204174.15978: stdout chunk (state=3): >>><<< 15980 1727204174.16090: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204174.16094: _low_level_execute_command(): starting 15980 1727204174.16097: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204174.0687952-18635-77704591394868/AnsiballZ_command.py && sleep 0' 15980 1727204174.16856: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 15980 1727204174.16956: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204174.16978: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204174.17100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204174.34282: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-24 14:56:14.333639", "end": "2024-09-24 14:56:14.341355", "delta": "0:00:00.007716", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15980 1727204174.35712: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. <<< 15980 1727204174.35766: stderr chunk (state=3): >>><<< 15980 1727204174.35777: stdout chunk (state=3): >>><<< 15980 1727204174.35791: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-24 14:56:14.333639", "end": "2024-09-24 14:56:14.341355", "delta": "0:00:00.007716", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 15980 1727204174.35827: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204174.0687952-18635-77704591394868/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204174.35837: _low_level_execute_command(): starting 15980 1727204174.35843: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204174.0687952-18635-77704591394868/ > /dev/null 2>&1 && sleep 0' 15980 1727204174.36335: stderr chunk 
(state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204174.36341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204174.36344: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204174.36350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204174.36402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204174.36405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204174.36408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204174.36491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204174.38439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204174.38499: stderr chunk (state=3): >>><<< 15980 1727204174.38503: stdout chunk (state=3): >>><<< 15980 1727204174.38519: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204174.38528: handler run complete 15980 1727204174.38546: Evaluated conditional (False): False 15980 1727204174.38555: attempt loop complete, returning result 15980 1727204174.38558: _execute() done 15980 1727204174.38561: dumping result to json 15980 1727204174.38568: done dumping result, returning 15980 1727204174.38576: done running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary [127b8e07-fff9-5f1d-4b72-000000000393] 15980 1727204174.38581: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000393 15980 1727204174.38690: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000393 15980 1727204174.38692: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! 
=> { "changed": false, "cmd": [ "ip", "link", "del", "LSR-TST-br31" ], "delta": "0:00:00.007716", "end": "2024-09-24 14:56:14.341355", "rc": 1, "start": "2024-09-24 14:56:14.333639" } STDERR: Cannot find device "LSR-TST-br31" MSG: non-zero return code ...ignoring 15980 1727204174.38770: no more pending results, returning what we have 15980 1727204174.38773: results queue empty 15980 1727204174.38774: checking for any_errors_fatal 15980 1727204174.38776: done checking for any_errors_fatal 15980 1727204174.38776: checking for max_fail_percentage 15980 1727204174.38778: done checking for max_fail_percentage 15980 1727204174.38779: checking to see if all hosts have failed and the running result is not ok 15980 1727204174.38780: done checking to see if all hosts have failed 15980 1727204174.38781: getting the remaining hosts for this loop 15980 1727204174.38782: done getting the remaining hosts for this loop 15980 1727204174.38786: getting the next task for host managed-node2 15980 1727204174.38795: done getting next task for host managed-node2 15980 1727204174.38797: ^ task is: TASK: meta (flush_handlers) 15980 1727204174.38799: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204174.38806: getting variables 15980 1727204174.38808: in VariableManager get_vars() 15980 1727204174.38841: Calling all_inventory to load vars for managed-node2 15980 1727204174.38844: Calling groups_inventory to load vars for managed-node2 15980 1727204174.38848: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204174.38860: Calling all_plugins_play to load vars for managed-node2 15980 1727204174.38863: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204174.38873: Calling groups_plugins_play to load vars for managed-node2 15980 1727204174.39908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204174.41112: done with get_vars() 15980 1727204174.41142: done getting variables 15980 1727204174.41204: in VariableManager get_vars() 15980 1727204174.41213: Calling all_inventory to load vars for managed-node2 15980 1727204174.41214: Calling groups_inventory to load vars for managed-node2 15980 1727204174.41216: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204174.41220: Calling all_plugins_play to load vars for managed-node2 15980 1727204174.41222: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204174.41223: Calling groups_plugins_play to load vars for managed-node2 15980 1727204174.42150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204174.43334: done with get_vars() 15980 1727204174.43364: done queuing things up, now waiting for results queue to drain 15980 1727204174.43368: results queue empty 15980 1727204174.43368: checking for any_errors_fatal 15980 1727204174.43371: done checking for any_errors_fatal 15980 1727204174.43372: checking for max_fail_percentage 15980 1727204174.43373: done checking for max_fail_percentage 15980 1727204174.43373: checking to see if all hosts have failed and the running result is not 
ok 15980 1727204174.43374: done checking to see if all hosts have failed 15980 1727204174.43374: getting the remaining hosts for this loop 15980 1727204174.43375: done getting the remaining hosts for this loop 15980 1727204174.43377: getting the next task for host managed-node2 15980 1727204174.43380: done getting next task for host managed-node2 15980 1727204174.43381: ^ task is: TASK: meta (flush_handlers) 15980 1727204174.43382: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204174.43384: getting variables 15980 1727204174.43385: in VariableManager get_vars() 15980 1727204174.43392: Calling all_inventory to load vars for managed-node2 15980 1727204174.43394: Calling groups_inventory to load vars for managed-node2 15980 1727204174.43395: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204174.43401: Calling all_plugins_play to load vars for managed-node2 15980 1727204174.43402: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204174.43404: Calling groups_plugins_play to load vars for managed-node2 15980 1727204174.44291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204174.45452: done with get_vars() 15980 1727204174.45480: done getting variables 15980 1727204174.45522: in VariableManager get_vars() 15980 1727204174.45532: Calling all_inventory to load vars for managed-node2 15980 1727204174.45538: Calling groups_inventory to load vars for managed-node2 15980 1727204174.45540: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204174.45544: Calling all_plugins_play to load vars for managed-node2 15980 1727204174.45546: Calling groups_plugins_inventory to load vars for 
managed-node2 15980 1727204174.45548: Calling groups_plugins_play to load vars for managed-node2 15980 1727204174.46391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204174.47569: done with get_vars() 15980 1727204174.47600: done queuing things up, now waiting for results queue to drain 15980 1727204174.47601: results queue empty 15980 1727204174.47602: checking for any_errors_fatal 15980 1727204174.47603: done checking for any_errors_fatal 15980 1727204174.47604: checking for max_fail_percentage 15980 1727204174.47605: done checking for max_fail_percentage 15980 1727204174.47605: checking to see if all hosts have failed and the running result is not ok 15980 1727204174.47606: done checking to see if all hosts have failed 15980 1727204174.47606: getting the remaining hosts for this loop 15980 1727204174.47608: done getting the remaining hosts for this loop 15980 1727204174.47610: getting the next task for host managed-node2 15980 1727204174.47613: done getting next task for host managed-node2 15980 1727204174.47614: ^ task is: None 15980 1727204174.47615: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204174.47616: done queuing things up, now waiting for results queue to drain 15980 1727204174.47616: results queue empty 15980 1727204174.47617: checking for any_errors_fatal 15980 1727204174.47617: done checking for any_errors_fatal 15980 1727204174.47618: checking for max_fail_percentage 15980 1727204174.47618: done checking for max_fail_percentage 15980 1727204174.47619: checking to see if all hosts have failed and the running result is not ok 15980 1727204174.47619: done checking to see if all hosts have failed 15980 1727204174.47620: getting the next task for host managed-node2 15980 1727204174.47621: done getting next task for host managed-node2 15980 1727204174.47622: ^ task is: None 15980 1727204174.47623: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204174.47668: in VariableManager get_vars() 15980 1727204174.47686: done with get_vars() 15980 1727204174.47691: in VariableManager get_vars() 15980 1727204174.47700: done with get_vars() 15980 1727204174.47703: variable 'omit' from source: magic vars 15980 1727204174.47804: variable 'profile' from source: play vars 15980 1727204174.47880: in VariableManager get_vars() 15980 1727204174.47892: done with get_vars() 15980 1727204174.47908: variable 'omit' from source: magic vars 15980 1727204174.47955: variable 'profile' from source: play vars

PLAY [Remove {{ profile }}] ****************************************************

15980 1727204174.48481: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15980 1727204174.48504: getting the remaining hosts for this loop 15980 1727204174.48505: done getting the remaining hosts for this loop 15980 1727204174.48507: getting the next task for host managed-node2 15980 1727204174.48510: done getting next task for host managed-node2 15980 1727204174.48512: ^ task is: TASK: Gathering Facts 15980 1727204174.48513: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204174.48515: getting variables 15980 1727204174.48516: in VariableManager get_vars() 15980 1727204174.48528: Calling all_inventory to load vars for managed-node2 15980 1727204174.48530: Calling groups_inventory to load vars for managed-node2 15980 1727204174.48531: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204174.48536: Calling all_plugins_play to load vars for managed-node2 15980 1727204174.48538: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204174.48540: Calling groups_plugins_play to load vars for managed-node2 15980 1727204174.49550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204174.50728: done with get_vars() 15980 1727204174.50754: done getting variables 15980 1727204174.50797: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Tuesday 24 September 2024 14:56:14 -0400 (0:00:00.482) 0:00:35.918 *****

15980 1727204174.50820: entering _queue_task() for managed-node2/gather_facts 15980 1727204174.51108: worker is 1 (out of 1 available) 15980 1727204174.51121: exiting _queue_task() for managed-node2/gather_facts 15980 1727204174.51134: done queuing things up, now waiting for results queue to drain 15980 1727204174.51137: waiting for pending results... 
15980 1727204174.51327: running TaskExecutor() for managed-node2/TASK: Gathering Facts 15980 1727204174.51407: in run() - task 127b8e07-fff9-5f1d-4b72-0000000003a1 15980 1727204174.51421: variable 'ansible_search_path' from source: unknown 15980 1727204174.51456: calling self._execute() 15980 1727204174.51547: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204174.51552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204174.51561: variable 'omit' from source: magic vars 15980 1727204174.51882: variable 'ansible_distribution_major_version' from source: facts 15980 1727204174.51894: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204174.51900: variable 'omit' from source: magic vars 15980 1727204174.51931: variable 'omit' from source: magic vars 15980 1727204174.51959: variable 'omit' from source: magic vars 15980 1727204174.51995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204174.52032: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204174.52049: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204174.52067: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204174.52078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204174.52103: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204174.52106: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204174.52111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204174.52188: Set connection var ansible_connection to ssh 15980 1727204174.52195: Set 
connection var ansible_pipelining to False 15980 1727204174.52201: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204174.52207: Set connection var ansible_timeout to 10 15980 1727204174.52212: Set connection var ansible_shell_type to sh 15980 1727204174.52218: Set connection var ansible_shell_executable to /bin/sh 15980 1727204174.52244: variable 'ansible_shell_executable' from source: unknown 15980 1727204174.52248: variable 'ansible_connection' from source: unknown 15980 1727204174.52250: variable 'ansible_module_compression' from source: unknown 15980 1727204174.52253: variable 'ansible_shell_type' from source: unknown 15980 1727204174.52256: variable 'ansible_shell_executable' from source: unknown 15980 1727204174.52258: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204174.52261: variable 'ansible_pipelining' from source: unknown 15980 1727204174.52263: variable 'ansible_timeout' from source: unknown 15980 1727204174.52267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204174.52414: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204174.52423: variable 'omit' from source: magic vars 15980 1727204174.52430: starting attempt loop 15980 1727204174.52433: running the handler 15980 1727204174.52447: variable 'ansible_facts' from source: unknown 15980 1727204174.52464: _low_level_execute_command(): starting 15980 1727204174.52475: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204174.53045: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 15980 1727204174.53049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204174.53052: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204174.53054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204174.53111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204174.53114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204174.53119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204174.53194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204174.54977: stdout chunk (state=3): >>>/root <<< 15980 1727204174.55087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204174.55145: stderr chunk (state=3): >>><<< 15980 1727204174.55149: stdout chunk (state=3): >>><<< 15980 1727204174.55178: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204174.55187: _low_level_execute_command(): starting 15980 1727204174.55196: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204174.5517442-18651-60835861710355 `" && echo ansible-tmp-1727204174.5517442-18651-60835861710355="` echo /root/.ansible/tmp/ansible-tmp-1727204174.5517442-18651-60835861710355 `" ) && sleep 0' 15980 1727204174.55693: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204174.55696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204174.55710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204174.55713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204174.55763: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204174.55772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204174.55775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204174.55843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204174.57825: stdout chunk (state=3): >>>ansible-tmp-1727204174.5517442-18651-60835861710355=/root/.ansible/tmp/ansible-tmp-1727204174.5517442-18651-60835861710355 <<< 15980 1727204174.57978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204174.58040: stderr chunk (state=3): >>><<< 15980 1727204174.58043: stdout chunk (state=3): >>><<< 15980 1727204174.58058: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204174.5517442-18651-60835861710355=/root/.ansible/tmp/ansible-tmp-1727204174.5517442-18651-60835861710355 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204174.58090: variable 'ansible_module_compression' from source: unknown 15980 1727204174.58141: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15980 1727204174.58202: variable 'ansible_facts' from source: unknown 15980 1727204174.58477: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204174.5517442-18651-60835861710355/AnsiballZ_setup.py 15980 1727204174.58600: Sending initial data 15980 1727204174.58610: Sent initial data (153 bytes) 15980 1727204174.59506: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204174.59526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204174.59643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204174.59688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204174.59815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204174.61430: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204174.61533: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204174.61594: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpr22vzuv_ /root/.ansible/tmp/ansible-tmp-1727204174.5517442-18651-60835861710355/AnsiballZ_setup.py <<< 15980 1727204174.61618: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204174.5517442-18651-60835861710355/AnsiballZ_setup.py" <<< 15980 1727204174.61703: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpr22vzuv_" to remote "/root/.ansible/tmp/ansible-tmp-1727204174.5517442-18651-60835861710355/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204174.5517442-18651-60835861710355/AnsiballZ_setup.py" <<< 15980 1727204174.63234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204174.63308: stderr chunk (state=3): >>><<< 15980 1727204174.63312: stdout chunk (state=3): >>><<< 15980 1727204174.63338: done transferring module to remote 15980 1727204174.63348: _low_level_execute_command(): starting 15980 1727204174.63353: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204174.5517442-18651-60835861710355/ /root/.ansible/tmp/ansible-tmp-1727204174.5517442-18651-60835861710355/AnsiballZ_setup.py && sleep 0' 15980 1727204174.63856: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204174.63860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204174.63863: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204174.63867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 15980 1727204174.63875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204174.63919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204174.63923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204174.63928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204174.63999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204174.65973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204174.65977: stderr chunk (state=3): >>><<< 15980 1727204174.65980: stdout chunk (state=3): >>><<< 15980 1727204174.65983: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204174.65985: _low_level_execute_command(): starting 15980 1727204174.65987: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204174.5517442-18651-60835861710355/AnsiballZ_setup.py && sleep 0' 15980 1727204174.66932: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204174.66953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204174.66981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 
1727204174.67073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204175.31452: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "14", "epoch": "1727204174", "epoch_int": "1727204174", "date": "2024-09-24", "time": "14:56:14", "iso8601_micro": 
"2024-09-24T18:56:14.972782Z", "iso8601": "2024-09-24T18:56:14Z", "iso8601_basic": "20240924T145614972782", "iso8601_basic_short": "20240924T145614", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_loadavg": {"1m": 0.6728515625, "5m": 0.51806640625, "15m": 0.26025390625}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": 
["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3051, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 665, "free": 3051}, "nocache": {"free": 3481, "used": 235}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], 
"masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 521, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325599744, "block_size": 4096, "block_total": 64479564, "block_available": 61358789, "block_used": 3120775, "inode_total": 16384000, "inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": 
"", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15980 1727204175.33596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204175.33661: stderr chunk (state=3): >>><<< 15980 1727204175.33664: stdout chunk (state=3): >>><<< 15980 1727204175.33692: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", 
"ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "14", "epoch": 
"1727204174", "epoch_int": "1727204174", "date": "2024-09-24", "time": "14:56:14", "iso8601_micro": "2024-09-24T18:56:14.972782Z", "iso8601": "2024-09-24T18:56:14Z", "iso8601_basic": "20240924T145614972782", "iso8601_basic_short": "20240924T145614", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_loadavg": {"1m": 0.6728515625, "5m": 0.51806640625, "15m": 0.26025390625}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, 
"ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3051, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 665, "free": 3051}, "nocache": {"free": 3481, "used": 235}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", 
"holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 521, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325599744, "block_size": 4096, "block_total": 64479564, "block_available": 61358789, "block_used": 3120775, "inode_total": 16384000, "inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
15980 1727204175.33911: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204174.5517442-18651-60835861710355/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204175.33934: _low_level_execute_command(): starting 15980 1727204175.33939: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204174.5517442-18651-60835861710355/ > /dev/null 2>&1 && sleep 0' 15980 1727204175.34440: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204175.34444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204175.34446: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204175.34449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204175.34510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204175.34514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204175.34518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204175.34595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204175.36542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204175.36604: stderr chunk (state=3): >>><<< 15980 1727204175.36608: stdout chunk (state=3): >>><<< 15980 1727204175.36620: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 15980 1727204175.36634: handler run complete 15980 1727204175.36715: variable 'ansible_facts' from source: unknown 15980 1727204175.36794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204175.36994: variable 'ansible_facts' from source: unknown 15980 1727204175.37049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204175.37129: attempt loop complete, returning result 15980 1727204175.37139: _execute() done 15980 1727204175.37142: dumping result to json 15980 1727204175.37158: done dumping result, returning 15980 1727204175.37169: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-5f1d-4b72-0000000003a1] 15980 1727204175.37172: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000003a1 15980 1727204175.37436: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000003a1 15980 1727204175.37439: WORKER PROCESS EXITING ok: [managed-node2] 15980 1727204175.37668: no more pending results, returning what we have 15980 1727204175.37671: results queue empty 15980 1727204175.37671: checking for any_errors_fatal 15980 1727204175.37672: done checking for any_errors_fatal 15980 1727204175.37673: checking for max_fail_percentage 15980 1727204175.37674: done checking for max_fail_percentage 15980 1727204175.37674: checking to see if all hosts have failed and the running result is not ok 15980 1727204175.37675: done checking to see if all hosts have failed 15980 1727204175.37676: getting the remaining hosts for this loop 15980 1727204175.37676: done getting the remaining hosts for this loop 15980 1727204175.37679: getting the next task for host managed-node2 15980 1727204175.37684: done getting next task for host managed-node2 15980 1727204175.37685: ^ task is: TASK: meta (flush_handlers) 15980 1727204175.37686: ^ state is: HOST STATE: block=1, task=1, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204175.37689: getting variables 15980 1727204175.37690: in VariableManager get_vars() 15980 1727204175.37713: Calling all_inventory to load vars for managed-node2 15980 1727204175.37715: Calling groups_inventory to load vars for managed-node2 15980 1727204175.37717: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204175.37727: Calling all_plugins_play to load vars for managed-node2 15980 1727204175.37729: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204175.37731: Calling groups_plugins_play to load vars for managed-node2 15980 1727204175.38776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204175.39974: done with get_vars() 15980 1727204175.40002: done getting variables 15980 1727204175.40066: in VariableManager get_vars() 15980 1727204175.40078: Calling all_inventory to load vars for managed-node2 15980 1727204175.40079: Calling groups_inventory to load vars for managed-node2 15980 1727204175.40081: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204175.40085: Calling all_plugins_play to load vars for managed-node2 15980 1727204175.40086: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204175.40088: Calling groups_plugins_play to load vars for managed-node2 15980 1727204175.40938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204175.42213: done with get_vars() 15980 1727204175.42241: done queuing things up, now waiting for results queue to drain 15980 1727204175.42243: results queue empty 15980 1727204175.42244: checking for any_errors_fatal 15980 
1727204175.42247: done checking for any_errors_fatal 15980 1727204175.42247: checking for max_fail_percentage 15980 1727204175.42248: done checking for max_fail_percentage 15980 1727204175.42253: checking to see if all hosts have failed and the running result is not ok 15980 1727204175.42254: done checking to see if all hosts have failed 15980 1727204175.42254: getting the remaining hosts for this loop 15980 1727204175.42255: done getting the remaining hosts for this loop 15980 1727204175.42257: getting the next task for host managed-node2 15980 1727204175.42260: done getting next task for host managed-node2 15980 1727204175.42263: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15980 1727204175.42264: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204175.42274: getting variables 15980 1727204175.42275: in VariableManager get_vars() 15980 1727204175.42287: Calling all_inventory to load vars for managed-node2 15980 1727204175.42288: Calling groups_inventory to load vars for managed-node2 15980 1727204175.42290: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204175.42294: Calling all_plugins_play to load vars for managed-node2 15980 1727204175.42296: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204175.42297: Calling groups_plugins_play to load vars for managed-node2 15980 1727204175.43143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204175.44306: done with get_vars() 15980 1727204175.44333: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:56:15 -0400 (0:00:00.935) 0:00:36.854 ***** 15980 1727204175.44400: entering _queue_task() for managed-node2/include_tasks 15980 1727204175.44694: worker is 1 (out of 1 available) 15980 1727204175.44709: exiting _queue_task() for managed-node2/include_tasks 15980 1727204175.44723: done queuing things up, now waiting for results queue to drain 15980 1727204175.44728: waiting for pending results... 
15980 1727204175.44923: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15980 1727204175.45005: in run() - task 127b8e07-fff9-5f1d-4b72-00000000005a 15980 1727204175.45020: variable 'ansible_search_path' from source: unknown 15980 1727204175.45023: variable 'ansible_search_path' from source: unknown 15980 1727204175.45055: calling self._execute() 15980 1727204175.45144: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204175.45148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204175.45157: variable 'omit' from source: magic vars 15980 1727204175.45462: variable 'ansible_distribution_major_version' from source: facts 15980 1727204175.45474: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204175.45481: _execute() done 15980 1727204175.45484: dumping result to json 15980 1727204175.45486: done dumping result, returning 15980 1727204175.45496: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-5f1d-4b72-00000000005a] 15980 1727204175.45499: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000005a 15980 1727204175.45611: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000005a 15980 1727204175.45614: WORKER PROCESS EXITING 15980 1727204175.45661: no more pending results, returning what we have 15980 1727204175.45668: in VariableManager get_vars() 15980 1727204175.45714: Calling all_inventory to load vars for managed-node2 15980 1727204175.45717: Calling groups_inventory to load vars for managed-node2 15980 1727204175.45719: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204175.45736: Calling all_plugins_play to load vars for managed-node2 15980 1727204175.45739: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204175.45743: Calling 
groups_plugins_play to load vars for managed-node2 15980 1727204175.46841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204175.48035: done with get_vars() 15980 1727204175.48060: variable 'ansible_search_path' from source: unknown 15980 1727204175.48061: variable 'ansible_search_path' from source: unknown 15980 1727204175.48089: we have included files to process 15980 1727204175.48089: generating all_blocks data 15980 1727204175.48091: done generating all_blocks data 15980 1727204175.48091: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15980 1727204175.48092: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15980 1727204175.48094: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15980 1727204175.48523: done processing included file 15980 1727204175.48527: iterating over new_blocks loaded from include file 15980 1727204175.48529: in VariableManager get_vars() 15980 1727204175.48546: done with get_vars() 15980 1727204175.48547: filtering new block on tags 15980 1727204175.48561: done filtering new block on tags 15980 1727204175.48563: in VariableManager get_vars() 15980 1727204175.48580: done with get_vars() 15980 1727204175.48582: filtering new block on tags 15980 1727204175.48594: done filtering new block on tags 15980 1727204175.48596: in VariableManager get_vars() 15980 1727204175.48609: done with get_vars() 15980 1727204175.48610: filtering new block on tags 15980 1727204175.48620: done filtering new block on tags 15980 1727204175.48621: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 15980 1727204175.48627: extending task lists for 
all hosts with included blocks 15980 1727204175.48883: done extending task lists 15980 1727204175.48884: done processing included files 15980 1727204175.48885: results queue empty 15980 1727204175.48885: checking for any_errors_fatal 15980 1727204175.48886: done checking for any_errors_fatal 15980 1727204175.48887: checking for max_fail_percentage 15980 1727204175.48888: done checking for max_fail_percentage 15980 1727204175.48888: checking to see if all hosts have failed and the running result is not ok 15980 1727204175.48889: done checking to see if all hosts have failed 15980 1727204175.48889: getting the remaining hosts for this loop 15980 1727204175.48890: done getting the remaining hosts for this loop 15980 1727204175.48892: getting the next task for host managed-node2 15980 1727204175.48896: done getting next task for host managed-node2 15980 1727204175.48898: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15980 1727204175.48899: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204175.48908: getting variables 15980 1727204175.48908: in VariableManager get_vars() 15980 1727204175.48919: Calling all_inventory to load vars for managed-node2 15980 1727204175.48921: Calling groups_inventory to load vars for managed-node2 15980 1727204175.48922: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204175.48928: Calling all_plugins_play to load vars for managed-node2 15980 1727204175.48930: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204175.48932: Calling groups_plugins_play to load vars for managed-node2 15980 1727204175.49892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204175.51074: done with get_vars() 15980 1727204175.51101: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:56:15 -0400 (0:00:00.067) 0:00:36.922 ***** 15980 1727204175.51172: entering _queue_task() for managed-node2/setup 15980 1727204175.51478: worker is 1 (out of 1 available) 15980 1727204175.51491: exiting _queue_task() for managed-node2/setup 15980 1727204175.51505: done queuing things up, now waiting for results queue to drain 15980 1727204175.51507: waiting for pending results... 
15980 1727204175.51701: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15980 1727204175.51799: in run() - task 127b8e07-fff9-5f1d-4b72-0000000003e2 15980 1727204175.51812: variable 'ansible_search_path' from source: unknown 15980 1727204175.51817: variable 'ansible_search_path' from source: unknown 15980 1727204175.51852: calling self._execute() 15980 1727204175.51937: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204175.51942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204175.51951: variable 'omit' from source: magic vars 15980 1727204175.52272: variable 'ansible_distribution_major_version' from source: facts 15980 1727204175.52284: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204175.52458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204175.54452: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204175.54509: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204175.54539: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204175.54571: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204175.54592: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204175.54661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204175.54688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204175.54771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204175.54775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204175.54799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204175.54871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204175.54892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204175.54913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204175.54954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204175.55174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204175.55196: variable '__network_required_facts' from source: role 
'' defaults 15980 1727204175.55216: variable 'ansible_facts' from source: unknown 15980 1727204175.55994: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15980 1727204175.55998: when evaluation is False, skipping this task 15980 1727204175.56001: _execute() done 15980 1727204175.56006: dumping result to json 15980 1727204175.56009: done dumping result, returning 15980 1727204175.56012: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-5f1d-4b72-0000000003e2] 15980 1727204175.56022: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000003e2 15980 1727204175.56120: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000003e2 15980 1727204175.56126: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15980 1727204175.56185: no more pending results, returning what we have 15980 1727204175.56189: results queue empty 15980 1727204175.56191: checking for any_errors_fatal 15980 1727204175.56192: done checking for any_errors_fatal 15980 1727204175.56193: checking for max_fail_percentage 15980 1727204175.56194: done checking for max_fail_percentage 15980 1727204175.56195: checking to see if all hosts have failed and the running result is not ok 15980 1727204175.56196: done checking to see if all hosts have failed 15980 1727204175.56197: getting the remaining hosts for this loop 15980 1727204175.56199: done getting the remaining hosts for this loop 15980 1727204175.56203: getting the next task for host managed-node2 15980 1727204175.56212: done getting next task for host managed-node2 15980 1727204175.56216: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15980 1727204175.56219: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204175.56242: getting variables 15980 1727204175.56244: in VariableManager get_vars() 15980 1727204175.56288: Calling all_inventory to load vars for managed-node2 15980 1727204175.56291: Calling groups_inventory to load vars for managed-node2 15980 1727204175.56293: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204175.56303: Calling all_plugins_play to load vars for managed-node2 15980 1727204175.56306: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204175.56309: Calling groups_plugins_play to load vars for managed-node2 15980 1727204175.58041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204175.60177: done with get_vars() 15980 1727204175.60217: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:56:15 -0400 (0:00:00.091) 0:00:37.013 ***** 15980 1727204175.60325: entering _queue_task() for managed-node2/stat 15980 1727204175.60719: worker is 1 (out of 1 available) 15980 1727204175.60733: exiting _queue_task() for managed-node2/stat 15980 1727204175.60747: done queuing things up, now waiting for results queue to drain 15980 1727204175.60749: waiting for pending results... 
15980 1727204175.61190: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 15980 1727204175.61226: in run() - task 127b8e07-fff9-5f1d-4b72-0000000003e4 15980 1727204175.61251: variable 'ansible_search_path' from source: unknown 15980 1727204175.61259: variable 'ansible_search_path' from source: unknown 15980 1727204175.61312: calling self._execute() 15980 1727204175.61436: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204175.61501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204175.61504: variable 'omit' from source: magic vars 15980 1727204175.61895: variable 'ansible_distribution_major_version' from source: facts 15980 1727204175.61914: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204175.62115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204175.62343: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204175.62383: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204175.62411: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204175.62438: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204175.62512: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204175.62532: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204175.62552: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204175.62573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204175.62643: variable '__network_is_ostree' from source: set_fact 15980 1727204175.62650: Evaluated conditional (not __network_is_ostree is defined): False 15980 1727204175.62653: when evaluation is False, skipping this task 15980 1727204175.62656: _execute() done 15980 1727204175.62658: dumping result to json 15980 1727204175.62662: done dumping result, returning 15980 1727204175.62672: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-5f1d-4b72-0000000003e4] 15980 1727204175.62677: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000003e4 15980 1727204175.62774: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000003e4 15980 1727204175.62776: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15980 1727204175.62846: no more pending results, returning what we have 15980 1727204175.62850: results queue empty 15980 1727204175.62851: checking for any_errors_fatal 15980 1727204175.62859: done checking for any_errors_fatal 15980 1727204175.62859: checking for max_fail_percentage 15980 1727204175.62861: done checking for max_fail_percentage 15980 1727204175.62862: checking to see if all hosts have failed and the running result is not ok 15980 1727204175.62863: done checking to see if all hosts have failed 15980 1727204175.62864: getting the remaining hosts for this loop 15980 1727204175.62867: done getting the remaining hosts for this loop 15980 
1727204175.62871: getting the next task for host managed-node2 15980 1727204175.62877: done getting next task for host managed-node2 15980 1727204175.62881: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15980 1727204175.62884: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204175.62898: getting variables 15980 1727204175.62900: in VariableManager get_vars() 15980 1727204175.62940: Calling all_inventory to load vars for managed-node2 15980 1727204175.62943: Calling groups_inventory to load vars for managed-node2 15980 1727204175.62945: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204175.62955: Calling all_plugins_play to load vars for managed-node2 15980 1727204175.62958: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204175.62960: Calling groups_plugins_play to load vars for managed-node2 15980 1727204175.64095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204175.66303: done with get_vars() 15980 1727204175.66339: done getting variables 15980 1727204175.66413: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:56:15 -0400 (0:00:00.061) 0:00:37.074 ***** 15980 1727204175.66451: entering _queue_task() for managed-node2/set_fact 15980 1727204175.67154: worker is 1 (out of 1 available) 15980 1727204175.67172: exiting _queue_task() for managed-node2/set_fact 15980 1727204175.67187: done queuing things up, now waiting for results queue to drain 15980 1727204175.67189: waiting for pending results... 15980 1727204175.67838: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15980 1727204175.67976: in run() - task 127b8e07-fff9-5f1d-4b72-0000000003e5 15980 1727204175.68001: variable 'ansible_search_path' from source: unknown 15980 1727204175.68010: variable 'ansible_search_path' from source: unknown 15980 1727204175.68063: calling self._execute() 15980 1727204175.68188: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204175.68201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204175.68273: variable 'omit' from source: magic vars 15980 1727204175.68646: variable 'ansible_distribution_major_version' from source: facts 15980 1727204175.68670: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204175.68881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204175.69212: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204175.69278: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204175.69323: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 
1727204175.69374: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204175.69529: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204175.69533: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204175.69561: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204175.69604: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204175.69734: variable '__network_is_ostree' from source: set_fact 15980 1727204175.69775: Evaluated conditional (not __network_is_ostree is defined): False 15980 1727204175.69778: when evaluation is False, skipping this task 15980 1727204175.69784: _execute() done 15980 1727204175.69800: dumping result to json 15980 1727204175.69883: done dumping result, returning 15980 1727204175.69887: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-5f1d-4b72-0000000003e5] 15980 1727204175.69890: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000003e5 15980 1727204175.70096: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000003e5 15980 1727204175.70099: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15980 1727204175.70190: no more pending results, returning what we 
have 15980 1727204175.70194: results queue empty 15980 1727204175.70195: checking for any_errors_fatal 15980 1727204175.70203: done checking for any_errors_fatal 15980 1727204175.70204: checking for max_fail_percentage 15980 1727204175.70205: done checking for max_fail_percentage 15980 1727204175.70206: checking to see if all hosts have failed and the running result is not ok 15980 1727204175.70208: done checking to see if all hosts have failed 15980 1727204175.70209: getting the remaining hosts for this loop 15980 1727204175.70211: done getting the remaining hosts for this loop 15980 1727204175.70215: getting the next task for host managed-node2 15980 1727204175.70228: done getting next task for host managed-node2 15980 1727204175.70232: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15980 1727204175.70235: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204175.70254: getting variables 15980 1727204175.70256: in VariableManager get_vars() 15980 1727204175.70302: Calling all_inventory to load vars for managed-node2 15980 1727204175.70305: Calling groups_inventory to load vars for managed-node2 15980 1727204175.70308: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204175.70320: Calling all_plugins_play to load vars for managed-node2 15980 1727204175.70323: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204175.70329: Calling groups_plugins_play to load vars for managed-node2 15980 1727204175.74550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204175.77290: done with get_vars() 15980 1727204175.77333: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:56:15 -0400 (0:00:00.109) 0:00:37.184 ***** 15980 1727204175.77447: entering _queue_task() for managed-node2/service_facts 15980 1727204175.77891: worker is 1 (out of 1 available) 15980 1727204175.77907: exiting _queue_task() for managed-node2/service_facts 15980 1727204175.77919: done queuing things up, now waiting for results queue to drain 15980 1727204175.77921: waiting for pending results... 
15980 1727204175.78291: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 15980 1727204175.78340: in run() - task 127b8e07-fff9-5f1d-4b72-0000000003e7 15980 1727204175.78362: variable 'ansible_search_path' from source: unknown 15980 1727204175.78372: variable 'ansible_search_path' from source: unknown 15980 1727204175.78423: calling self._execute() 15980 1727204175.78552: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204175.78607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204175.78622: variable 'omit' from source: magic vars 15980 1727204175.79155: variable 'ansible_distribution_major_version' from source: facts 15980 1727204175.79161: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204175.79163: variable 'omit' from source: magic vars 15980 1727204175.79186: variable 'omit' from source: magic vars 15980 1727204175.79238: variable 'omit' from source: magic vars 15980 1727204175.79297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204175.79372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204175.79376: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204175.79400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204175.79417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204175.79452: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204175.79461: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204175.79485: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 15980 1727204175.79702: Set connection var ansible_connection to ssh 15980 1727204175.79705: Set connection var ansible_pipelining to False 15980 1727204175.79708: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204175.79712: Set connection var ansible_timeout to 10 15980 1727204175.79715: Set connection var ansible_shell_type to sh 15980 1727204175.79717: Set connection var ansible_shell_executable to /bin/sh 15980 1727204175.79719: variable 'ansible_shell_executable' from source: unknown 15980 1727204175.79721: variable 'ansible_connection' from source: unknown 15980 1727204175.79724: variable 'ansible_module_compression' from source: unknown 15980 1727204175.79725: variable 'ansible_shell_type' from source: unknown 15980 1727204175.79727: variable 'ansible_shell_executable' from source: unknown 15980 1727204175.79730: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204175.79732: variable 'ansible_pipelining' from source: unknown 15980 1727204175.79734: variable 'ansible_timeout' from source: unknown 15980 1727204175.79811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204175.79984: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204175.80004: variable 'omit' from source: magic vars 15980 1727204175.80016: starting attempt loop 15980 1727204175.80028: running the handler 15980 1727204175.80050: _low_level_execute_command(): starting 15980 1727204175.80066: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204175.80899: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204175.80918: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15980 1727204175.81054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204175.81067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204175.81405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204175.81409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204175.83171: stdout chunk (state=3): >>>/root <<< 15980 1727204175.83373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204175.83390: stdout chunk (state=3): >>><<< 15980 1727204175.83673: stderr chunk (state=3): >>><<< 15980 1727204175.83678: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204175.83682: _low_level_execute_command(): starting 15980 1727204175.83686: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204175.8349524-18682-185881390012837 `" && echo ansible-tmp-1727204175.8349524-18682-185881390012837="` echo /root/.ansible/tmp/ansible-tmp-1727204175.8349524-18682-185881390012837 `" ) && sleep 0' 15980 1727204175.84942: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204175.84948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204175.84961: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 15980 1727204175.85013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204175.85057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204175.85173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204175.85280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204175.87287: stdout chunk (state=3): >>>ansible-tmp-1727204175.8349524-18682-185881390012837=/root/.ansible/tmp/ansible-tmp-1727204175.8349524-18682-185881390012837 <<< 15980 1727204175.87501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204175.87505: stdout chunk (state=3): >>><<< 15980 1727204175.87508: stderr chunk (state=3): >>><<< 15980 1727204175.87524: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204175.8349524-18682-185881390012837=/root/.ansible/tmp/ansible-tmp-1727204175.8349524-18682-185881390012837 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204175.87673: variable 'ansible_module_compression' from source: unknown 15980 1727204175.87677: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15980 1727204175.87700: variable 'ansible_facts' from source: unknown 15980 1727204175.87797: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204175.8349524-18682-185881390012837/AnsiballZ_service_facts.py 15980 1727204175.87990: Sending initial data 15980 1727204175.87999: Sent initial data (162 bytes) 15980 1727204175.88662: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204175.88887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204175.89121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204175.89125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204175.89209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204175.90840: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204175.90910: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204175.91009: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpaka9v3wd /root/.ansible/tmp/ansible-tmp-1727204175.8349524-18682-185881390012837/AnsiballZ_service_facts.py <<< 15980 1727204175.91013: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204175.8349524-18682-185881390012837/AnsiballZ_service_facts.py" <<< 15980 1727204175.91076: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpaka9v3wd" to remote "/root/.ansible/tmp/ansible-tmp-1727204175.8349524-18682-185881390012837/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204175.8349524-18682-185881390012837/AnsiballZ_service_facts.py" <<< 15980 1727204175.92664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204175.92673: stdout chunk (state=3): >>><<< 15980 1727204175.92676: stderr chunk (state=3): >>><<< 15980 1727204175.92678: done transferring module to remote 15980 1727204175.92685: _low_level_execute_command(): starting 15980 1727204175.92688: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204175.8349524-18682-185881390012837/ /root/.ansible/tmp/ansible-tmp-1727204175.8349524-18682-185881390012837/AnsiballZ_service_facts.py && sleep 0' 15980 1727204175.93703: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204175.93789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204175.93899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204175.95742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204175.95864: stderr chunk (state=3): >>><<< 15980 1727204175.95880: stdout chunk (state=3): >>><<< 15980 1727204175.95899: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204175.95906: _low_level_execute_command(): starting 15980 1727204175.95914: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204175.8349524-18682-185881390012837/AnsiballZ_service_facts.py && sleep 0' 15980 1727204175.96832: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204175.96852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204175.96879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204175.96903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204175.96936: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 1727204175.96967: stderr chunk (state=3): >>>debug2: match not found <<< 15980 1727204175.96984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204175.97003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15980 1727204175.97018: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 15980 1727204175.97117: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204175.97152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204175.97372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204178.14188: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": 
"syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": 
"active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": 
"dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": 
"nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": 
"selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": 
{"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": 
"systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15980 1727204178.15753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 15980 1727204178.15757: stdout chunk (state=3): >>><<< 15980 1727204178.15760: stderr chunk (state=3): >>><<< 15980 1727204178.15945: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", 
"source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": 
"fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": 
{"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 15980 1727204178.17208: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204175.8349524-18682-185881390012837/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204178.17233: _low_level_execute_command(): starting 15980 1727204178.17250: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204175.8349524-18682-185881390012837/ > /dev/null 2>&1 && sleep 0' 15980 1727204178.17977: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204178.17998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204178.18118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204178.18137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204178.18251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204178.20250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204178.20268: stdout chunk (state=3): >>><<< 15980 1727204178.20286: stderr chunk (state=3): >>><<< 15980 1727204178.20306: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204178.20319: handler run complete 15980 1727204178.20589: variable 'ansible_facts' from source: unknown 15980 1727204178.20872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204178.21466: variable 'ansible_facts' from source: unknown 15980 1727204178.21640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204178.21924: attempt loop complete, returning result 15980 1727204178.21938: _execute() done 15980 1727204178.21946: dumping result to json 15980 1727204178.22054: done dumping result, returning 15980 1727204178.22126: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-5f1d-4b72-0000000003e7] 15980 1727204178.22129: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000003e7 15980 1727204178.23999: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000003e7 15980 1727204178.24008: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15980 1727204178.24136: no more pending results, returning what we have 15980 1727204178.24206: results queue empty 15980 1727204178.24207: checking for any_errors_fatal 15980 1727204178.24213: done checking for any_errors_fatal 15980 1727204178.24214: checking for max_fail_percentage 15980 1727204178.24215: done checking for max_fail_percentage 15980 1727204178.24216: checking to see if all hosts have failed and the running result is not ok 15980 1727204178.24217: done checking to see if all hosts have failed 15980 1727204178.24218: getting the remaining 
hosts for this loop 15980 1727204178.24220: done getting the remaining hosts for this loop 15980 1727204178.24223: getting the next task for host managed-node2 15980 1727204178.24230: done getting next task for host managed-node2 15980 1727204178.24237: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15980 1727204178.24240: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204178.24347: getting variables 15980 1727204178.24350: in VariableManager get_vars() 15980 1727204178.24442: Calling all_inventory to load vars for managed-node2 15980 1727204178.24456: Calling groups_inventory to load vars for managed-node2 15980 1727204178.24460: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204178.24485: Calling all_plugins_play to load vars for managed-node2 15980 1727204178.24510: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204178.24516: Calling groups_plugins_play to load vars for managed-node2 15980 1727204178.31317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204178.33527: done with get_vars() 15980 1727204178.33594: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:56:18 -0400 
(0:00:02.563) 0:00:39.748 ***** 15980 1727204178.33828: entering _queue_task() for managed-node2/package_facts 15980 1727204178.34224: worker is 1 (out of 1 available) 15980 1727204178.34239: exiting _queue_task() for managed-node2/package_facts 15980 1727204178.34253: done queuing things up, now waiting for results queue to drain 15980 1727204178.34255: waiting for pending results... 15980 1727204178.34687: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 15980 1727204178.34745: in run() - task 127b8e07-fff9-5f1d-4b72-0000000003e8 15980 1727204178.34770: variable 'ansible_search_path' from source: unknown 15980 1727204178.34872: variable 'ansible_search_path' from source: unknown 15980 1727204178.34879: calling self._execute() 15980 1727204178.34971: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204178.34985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204178.35000: variable 'omit' from source: magic vars 15980 1727204178.35437: variable 'ansible_distribution_major_version' from source: facts 15980 1727204178.35461: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204178.35475: variable 'omit' from source: magic vars 15980 1727204178.35542: variable 'omit' from source: magic vars 15980 1727204178.35662: variable 'omit' from source: magic vars 15980 1727204178.35665: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204178.35697: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204178.35724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204178.35748: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204178.35770: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204178.35809: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204178.35818: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204178.35826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204178.35940: Set connection var ansible_connection to ssh 15980 1727204178.35956: Set connection var ansible_pipelining to False 15980 1727204178.35968: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204178.35981: Set connection var ansible_timeout to 10 15980 1727204178.35996: Set connection var ansible_shell_type to sh 15980 1727204178.36005: Set connection var ansible_shell_executable to /bin/sh 15980 1727204178.36070: variable 'ansible_shell_executable' from source: unknown 15980 1727204178.36073: variable 'ansible_connection' from source: unknown 15980 1727204178.36076: variable 'ansible_module_compression' from source: unknown 15980 1727204178.36078: variable 'ansible_shell_type' from source: unknown 15980 1727204178.36080: variable 'ansible_shell_executable' from source: unknown 15980 1727204178.36082: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204178.36084: variable 'ansible_pipelining' from source: unknown 15980 1727204178.36086: variable 'ansible_timeout' from source: unknown 15980 1727204178.36088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204178.36311: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204178.36472: variable 'omit' from source: magic vars 15980 1727204178.36476: starting attempt loop 15980 
1727204178.36478: running the handler 15980 1727204178.36481: _low_level_execute_command(): starting 15980 1727204178.36484: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204178.37194: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204178.37273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204178.37333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204178.37356: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204178.37387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204178.37496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204178.39241: stdout chunk (state=3): >>>/root <<< 15980 1727204178.39411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204178.39490: stderr chunk (state=3): >>><<< 15980 1727204178.39494: stdout chunk (state=3): >>><<< 15980 1727204178.39519: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204178.39571: _low_level_execute_command(): starting 15980 1727204178.39576: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204178.3952875-18996-172765155783845 `" && echo ansible-tmp-1727204178.3952875-18996-172765155783845="` echo /root/.ansible/tmp/ansible-tmp-1727204178.3952875-18996-172765155783845 `" ) && sleep 0' 15980 1727204178.40237: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204178.40257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204178.40278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204178.40297: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204178.40404: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204178.40436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204178.40546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204178.42540: stdout chunk (state=3): >>>ansible-tmp-1727204178.3952875-18996-172765155783845=/root/.ansible/tmp/ansible-tmp-1727204178.3952875-18996-172765155783845 <<< 15980 1727204178.42701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204178.42771: stderr chunk (state=3): >>><<< 15980 1727204178.42781: stdout chunk (state=3): >>><<< 15980 1727204178.42816: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204178.3952875-18996-172765155783845=/root/.ansible/tmp/ansible-tmp-1727204178.3952875-18996-172765155783845 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204178.42971: variable 'ansible_module_compression' from source: unknown 15980 1727204178.42975: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15980 1727204178.43015: variable 'ansible_facts' from source: unknown 15980 1727204178.43221: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204178.3952875-18996-172765155783845/AnsiballZ_package_facts.py 15980 1727204178.43447: Sending initial data 15980 1727204178.43450: Sent initial data (162 bytes) 15980 1727204178.44238: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204178.44285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 15980 1727204178.44387: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204178.44402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204178.44438: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204178.44541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204178.46159: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15980 1727204178.46199: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204178.46268: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204178.46349: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpcc5zy7vw /root/.ansible/tmp/ansible-tmp-1727204178.3952875-18996-172765155783845/AnsiballZ_package_facts.py <<< 15980 1727204178.46353: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204178.3952875-18996-172765155783845/AnsiballZ_package_facts.py" <<< 15980 1727204178.46442: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpcc5zy7vw" to remote "/root/.ansible/tmp/ansible-tmp-1727204178.3952875-18996-172765155783845/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204178.3952875-18996-172765155783845/AnsiballZ_package_facts.py" <<< 15980 1727204178.48229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204178.48483: stderr chunk (state=3): >>><<< 15980 1727204178.48487: stdout chunk (state=3): >>><<< 15980 1727204178.48490: done transferring module to remote 15980 1727204178.48493: _low_level_execute_command(): starting 15980 1727204178.48495: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204178.3952875-18996-172765155783845/ /root/.ansible/tmp/ansible-tmp-1727204178.3952875-18996-172765155783845/AnsiballZ_package_facts.py && sleep 0' 15980 1727204178.49120: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204178.49139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204178.49185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204178.49201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204178.49216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 1727204178.49227: stderr chunk (state=3): >>>debug2: match found <<< 15980 1727204178.49282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204178.49321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204178.49345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204178.49372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204178.49477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204178.51400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204178.51423: stderr chunk (state=3): >>><<< 15980 1727204178.51436: stdout chunk (state=3): >>><<< 15980 1727204178.51462: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204178.51474: _low_level_execute_command(): starting 15980 1727204178.51490: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204178.3952875-18996-172765155783845/AnsiballZ_package_facts.py && sleep 0' 15980 1727204178.52289: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204178.52342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204178.52380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204178.52422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204178.52513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204179.14992: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": 
"20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": 
[{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"na<<< 15980 1727204179.15010: stdout chunk (state=3): >>>me": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", 
"release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40",<<< 15980 1727204179.15021: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": 
"2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-l<<< 15980 1727204179.15044: stdout chunk (state=3): >>>ibs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": 
"libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", 
"release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": 
"openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": 
[{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch":
"x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": 
"9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": 
"perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", 
"release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", 
"release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", 
"release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": 
"4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": 
"3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "s<<< 15980 1727204179.15098: stdout chunk (state=3): >>>ource": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": 
"wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15980 1727204179.16951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204179.17020: stderr chunk (state=3): >>><<< 15980 1727204179.17024: stdout chunk (state=3): >>><<< 15980 1727204179.17084: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": 
[{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", 
"version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": 
"2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": 
"9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", 
"release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": 
[{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": 
[{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", 
"release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": 
"volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": 
"2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": 
"506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": 
[{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": 
[{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": 
"systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", 
"release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": 
[{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": 
[{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", 
"version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
15980 1727204179.19838: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204178.3952875-18996-172765155783845/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204179.19856: _low_level_execute_command(): starting 15980 1727204179.19879: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204178.3952875-18996-172765155783845/ > /dev/null 2>&1 && sleep 0' 15980 1727204179.20465: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204179.20470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204179.20484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204179.20487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204179.20542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204179.20545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204179.20548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204179.20609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204179.22555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204179.22623: stderr chunk (state=3): >>><<< 15980 1727204179.22626: stdout chunk (state=3): >>><<< 15980 1727204179.22642: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 15980 1727204179.22649: handler run complete 15980 1727204179.23298: variable 'ansible_facts' from source: unknown 15980 1727204179.23699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204179.25610: variable 'ansible_facts' from source: unknown 15980 1727204179.26046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204179.26856: attempt loop complete, returning result 15980 1727204179.26861: _execute() done 15980 1727204179.26864: dumping result to json 15980 1727204179.27047: done dumping result, returning 15980 1727204179.27056: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-5f1d-4b72-0000000003e8] 15980 1727204179.27061: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000003e8 15980 1727204179.29714: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000003e8 15980 1727204179.29719: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15980 1727204179.29875: no more pending results, returning what we have 15980 1727204179.29878: results queue empty 15980 1727204179.29879: checking for any_errors_fatal 15980 1727204179.29884: done checking for any_errors_fatal 15980 1727204179.29885: checking for max_fail_percentage 15980 1727204179.29887: done checking for max_fail_percentage 15980 1727204179.29888: checking to see if all hosts have failed and the running result is not ok 15980 1727204179.29889: done checking to see if all hosts have failed 15980 1727204179.29890: getting the remaining hosts for this loop 15980 1727204179.29891: done getting the remaining hosts for this loop 15980 1727204179.29895: getting the next task for host managed-node2 15980 1727204179.29902: done 
getting next task for host managed-node2 15980 1727204179.29906: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15980 1727204179.29908: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204179.29917: getting variables 15980 1727204179.29918: in VariableManager get_vars() 15980 1727204179.29954: Calling all_inventory to load vars for managed-node2 15980 1727204179.29957: Calling groups_inventory to load vars for managed-node2 15980 1727204179.29959: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204179.30079: Calling all_plugins_play to load vars for managed-node2 15980 1727204179.30084: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204179.30089: Calling groups_plugins_play to load vars for managed-node2 15980 1727204179.31480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204179.32683: done with get_vars() 15980 1727204179.32712: done getting variables 15980 1727204179.32767: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.989) 0:00:40.738 ***** 15980 1727204179.32797: entering _queue_task() for managed-node2/debug 15980 1727204179.33187: worker is 1 (out of 1 available) 15980 
1727204179.33201: exiting _queue_task() for managed-node2/debug 15980 1727204179.33217: done queuing things up, now waiting for results queue to drain 15980 1727204179.33219: waiting for pending results... 15980 1727204179.33597: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 15980 1727204179.33681: in run() - task 127b8e07-fff9-5f1d-4b72-00000000005b 15980 1727204179.33708: variable 'ansible_search_path' from source: unknown 15980 1727204179.33716: variable 'ansible_search_path' from source: unknown 15980 1727204179.33763: calling self._execute() 15980 1727204179.33908: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204179.33912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204179.33915: variable 'omit' from source: magic vars 15980 1727204179.34571: variable 'ansible_distribution_major_version' from source: facts 15980 1727204179.34586: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204179.34599: variable 'omit' from source: magic vars 15980 1727204179.34659: variable 'omit' from source: magic vars 15980 1727204179.34857: variable 'network_provider' from source: set_fact 15980 1727204179.34861: variable 'omit' from source: magic vars 15980 1727204179.34890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204179.34943: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204179.34982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204179.35014: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204179.35039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 15980 1727204179.35090: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204179.35105: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204179.35115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204179.35241: Set connection var ansible_connection to ssh 15980 1727204179.35291: Set connection var ansible_pipelining to False 15980 1727204179.35294: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204179.35297: Set connection var ansible_timeout to 10 15980 1727204179.35299: Set connection var ansible_shell_type to sh 15980 1727204179.35301: Set connection var ansible_shell_executable to /bin/sh 15980 1727204179.35331: variable 'ansible_shell_executable' from source: unknown 15980 1727204179.35342: variable 'ansible_connection' from source: unknown 15980 1727204179.35351: variable 'ansible_module_compression' from source: unknown 15980 1727204179.35399: variable 'ansible_shell_type' from source: unknown 15980 1727204179.35403: variable 'ansible_shell_executable' from source: unknown 15980 1727204179.35405: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204179.35407: variable 'ansible_pipelining' from source: unknown 15980 1727204179.35410: variable 'ansible_timeout' from source: unknown 15980 1727204179.35411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204179.35578: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204179.35600: variable 'omit' from source: magic vars 15980 1727204179.35618: starting attempt loop 15980 1727204179.35630: running the handler 15980 1727204179.35728: handler run 
complete 15980 1727204179.35731: attempt loop complete, returning result 15980 1727204179.35734: _execute() done 15980 1727204179.35736: dumping result to json 15980 1727204179.35739: done dumping result, returning 15980 1727204179.35741: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-5f1d-4b72-00000000005b] 15980 1727204179.35749: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000005b ok: [managed-node2] => {} MSG: Using network provider: nm 15980 1727204179.35938: no more pending results, returning what we have 15980 1727204179.35942: results queue empty 15980 1727204179.35943: checking for any_errors_fatal 15980 1727204179.35955: done checking for any_errors_fatal 15980 1727204179.35956: checking for max_fail_percentage 15980 1727204179.35958: done checking for max_fail_percentage 15980 1727204179.35959: checking to see if all hosts have failed and the running result is not ok 15980 1727204179.35960: done checking to see if all hosts have failed 15980 1727204179.35961: getting the remaining hosts for this loop 15980 1727204179.35963: done getting the remaining hosts for this loop 15980 1727204179.35970: getting the next task for host managed-node2 15980 1727204179.35977: done getting next task for host managed-node2 15980 1727204179.35982: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15980 1727204179.35984: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204179.35996: getting variables 15980 1727204179.35998: in VariableManager get_vars() 15980 1727204179.36047: Calling all_inventory to load vars for managed-node2 15980 1727204179.36051: Calling groups_inventory to load vars for managed-node2 15980 1727204179.36054: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204179.36370: Calling all_plugins_play to load vars for managed-node2 15980 1727204179.36376: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204179.36380: Calling groups_plugins_play to load vars for managed-node2 15980 1727204179.37109: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000005b 15980 1727204179.37114: WORKER PROCESS EXITING 15980 1727204179.38504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204179.41547: done with get_vars() 15980 1727204179.41795: done getting variables 15980 1727204179.41868: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.091) 0:00:40.829 ***** 15980 1727204179.41903: entering _queue_task() for managed-node2/fail 15980 1727204179.42747: worker is 1 (out of 1 available) 15980 1727204179.42760: exiting _queue_task() for managed-node2/fail 15980 1727204179.42775: done queuing things up, now waiting for results queue to drain 15980 1727204179.42778: waiting for pending results... 
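
The task just queued guards against combining the declarative `network_state` setting with the initscripts provider; the trace that follows shows it being skipped because `network_state` still holds its role default of `{}`. A minimal, hypothetical sketch of such a guard task (the actual task at roles/network/tasks/main.yml:11 in the collection may be worded differently):

```yaml
# Hypothetical sketch of the guard task announced above; the real task
# in fedora.linux_system_roles.network may use different conditions.
- name: Abort applying the network state configuration if using the
    network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state requires the nm provider
  when:
    - network_state != {}                # evaluated first in the trace; False here
    - network_provider == "initscripts"
```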
15980 1727204179.43286: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15980 1727204179.43430: in run() - task 127b8e07-fff9-5f1d-4b72-00000000005c 15980 1727204179.43455: variable 'ansible_search_path' from source: unknown 15980 1727204179.43463: variable 'ansible_search_path' from source: unknown 15980 1727204179.43516: calling self._execute() 15980 1727204179.43643: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204179.43656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204179.43674: variable 'omit' from source: magic vars 15980 1727204179.44118: variable 'ansible_distribution_major_version' from source: facts 15980 1727204179.44145: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204179.44295: variable 'network_state' from source: role '' defaults 15980 1727204179.44312: Evaluated conditional (network_state != {}): False 15980 1727204179.44320: when evaluation is False, skipping this task 15980 1727204179.44329: _execute() done 15980 1727204179.44336: dumping result to json 15980 1727204179.44346: done dumping result, returning 15980 1727204179.44361: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-5f1d-4b72-00000000005c] 15980 1727204179.44462: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000005c 15980 1727204179.44547: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000005c 15980 1727204179.44555: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15980 1727204179.44607: no more pending results, 
returning what we have 15980 1727204179.44611: results queue empty 15980 1727204179.44612: checking for any_errors_fatal 15980 1727204179.44621: done checking for any_errors_fatal 15980 1727204179.44622: checking for max_fail_percentage 15980 1727204179.44623: done checking for max_fail_percentage 15980 1727204179.44624: checking to see if all hosts have failed and the running result is not ok 15980 1727204179.44625: done checking to see if all hosts have failed 15980 1727204179.44627: getting the remaining hosts for this loop 15980 1727204179.44629: done getting the remaining hosts for this loop 15980 1727204179.44633: getting the next task for host managed-node2 15980 1727204179.44638: done getting next task for host managed-node2 15980 1727204179.44642: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15980 1727204179.44645: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204179.44662: getting variables 15980 1727204179.44664: in VariableManager get_vars() 15980 1727204179.44709: Calling all_inventory to load vars for managed-node2 15980 1727204179.44712: Calling groups_inventory to load vars for managed-node2 15980 1727204179.44714: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204179.44727: Calling all_plugins_play to load vars for managed-node2 15980 1727204179.44731: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204179.44734: Calling groups_plugins_play to load vars for managed-node2 15980 1727204179.46573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204179.49041: done with get_vars() 15980 1727204179.49071: done getting variables 15980 1727204179.49138: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.072) 0:00:40.902 ***** 15980 1727204179.49173: entering _queue_task() for managed-node2/fail 15980 1727204179.49775: worker is 1 (out of 1 available) 15980 1727204179.49788: exiting _queue_task() for managed-node2/fail 15980 1727204179.49798: done queuing things up, now waiting for results queue to drain 15980 1727204179.49800: waiting for pending results... 
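
The next task enforces a minimum managed-host version for `network_state`; in the trace that follows it is likewise skipped on the `network_state != {}` condition before any version comparison is reached. A hedged sketch (the real task at roles/network/tasks/main.yml:18 may differ):

```yaml
# Hypothetical sketch; the actual version bound and message may differ.
- name: Abort applying the network state configuration if the system
    version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying network_state requires a managed host of major version 8 or later
  when:
    - network_state != {}                            # False here, so the task skips
    - ansible_distribution_major_version | int < 8
```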
15980 1727204179.49988: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15980 1727204179.50082: in run() - task 127b8e07-fff9-5f1d-4b72-00000000005d 15980 1727204179.50105: variable 'ansible_search_path' from source: unknown 15980 1727204179.50137: variable 'ansible_search_path' from source: unknown 15980 1727204179.50170: calling self._execute() 15980 1727204179.50297: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204179.50356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204179.50360: variable 'omit' from source: magic vars 15980 1727204179.50800: variable 'ansible_distribution_major_version' from source: facts 15980 1727204179.50821: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204179.50975: variable 'network_state' from source: role '' defaults 15980 1727204179.50994: Evaluated conditional (network_state != {}): False 15980 1727204179.51001: when evaluation is False, skipping this task 15980 1727204179.51070: _execute() done 15980 1727204179.51074: dumping result to json 15980 1727204179.51077: done dumping result, returning 15980 1727204179.51080: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-5f1d-4b72-00000000005d] 15980 1727204179.51082: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000005d skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15980 1727204179.51283: no more pending results, returning what we have 15980 1727204179.51288: results queue empty 15980 1727204179.51289: checking for any_errors_fatal 15980 1727204179.51300: done checking for any_errors_fatal 
15980 1727204179.51301: checking for max_fail_percentage 15980 1727204179.51303: done checking for max_fail_percentage 15980 1727204179.51304: checking to see if all hosts have failed and the running result is not ok 15980 1727204179.51305: done checking to see if all hosts have failed 15980 1727204179.51306: getting the remaining hosts for this loop 15980 1727204179.51308: done getting the remaining hosts for this loop 15980 1727204179.51312: getting the next task for host managed-node2 15980 1727204179.51319: done getting next task for host managed-node2 15980 1727204179.51324: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15980 1727204179.51329: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204179.51350: getting variables 15980 1727204179.51352: in VariableManager get_vars() 15980 1727204179.51400: Calling all_inventory to load vars for managed-node2 15980 1727204179.51403: Calling groups_inventory to load vars for managed-node2 15980 1727204179.51406: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204179.51420: Calling all_plugins_play to load vars for managed-node2 15980 1727204179.51423: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204179.51429: Calling groups_plugins_play to load vars for managed-node2 15980 1727204179.51985: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000005d 15980 1727204179.51990: WORKER PROCESS EXITING 15980 1727204179.53497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204179.55862: done with get_vars() 15980 1727204179.55909: done getting variables 15980 1727204179.55980: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.068) 0:00:40.970 ***** 15980 1727204179.56020: entering _queue_task() for managed-node2/fail 15980 1727204179.56715: worker is 1 (out of 1 available) 15980 1727204179.56735: exiting _queue_task() for managed-node2/fail 15980 1727204179.56747: done queuing things up, now waiting for results queue to drain 15980 1727204179.56750: waiting for pending results... 
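
The teaming guard announced above is evaluated in two stages in the trace that follows: the version check (`ansible_distribution_major_version | int > 9`) passes, but `ansible_distribution in __network_rh_distros` is False, so the task skips. A hypothetical sketch of such a task (the real one may also check whether any team interfaces are actually defined):

```yaml
# Hypothetical sketch of the EL10+ teaming guard seen in the trace.
- name: Abort applying teaming configuration if the system version of
    the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later
  when:
    - ansible_distribution_major_version | int > 9   # True in the trace
    - ansible_distribution in __network_rh_distros   # False here, so it skips
```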
15980 1727204179.56985: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15980 1727204179.57161: in run() - task 127b8e07-fff9-5f1d-4b72-00000000005e 15980 1727204179.57203: variable 'ansible_search_path' from source: unknown 15980 1727204179.57226: variable 'ansible_search_path' from source: unknown 15980 1727204179.57290: calling self._execute() 15980 1727204179.57674: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204179.57682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204179.57689: variable 'omit' from source: magic vars 15980 1727204179.58051: variable 'ansible_distribution_major_version' from source: facts 15980 1727204179.58073: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204179.58343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204179.60922: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204179.61008: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204179.61057: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204179.61102: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204179.61139: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204179.61244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204179.62142: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204179.62474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204179.62480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204179.62483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204179.62613: variable 'ansible_distribution_major_version' from source: facts 15980 1727204179.62810: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15980 1727204179.62989: variable 'ansible_distribution' from source: facts 15980 1727204179.62999: variable '__network_rh_distros' from source: role '' defaults 15980 1727204179.63015: Evaluated conditional (ansible_distribution in __network_rh_distros): False 15980 1727204179.63023: when evaluation is False, skipping this task 15980 1727204179.63035: _execute() done 15980 1727204179.63039: dumping result to json 15980 1727204179.63043: done dumping result, returning 15980 1727204179.63049: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-5f1d-4b72-00000000005e] 15980 1727204179.63080: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000005e skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": 
"Conditional result was False" } 15980 1727204179.63242: no more pending results, returning what we have 15980 1727204179.63246: results queue empty 15980 1727204179.63247: checking for any_errors_fatal 15980 1727204179.63253: done checking for any_errors_fatal 15980 1727204179.63254: checking for max_fail_percentage 15980 1727204179.63256: done checking for max_fail_percentage 15980 1727204179.63257: checking to see if all hosts have failed and the running result is not ok 15980 1727204179.63258: done checking to see if all hosts have failed 15980 1727204179.63259: getting the remaining hosts for this loop 15980 1727204179.63275: done getting the remaining hosts for this loop 15980 1727204179.63280: getting the next task for host managed-node2 15980 1727204179.63287: done getting next task for host managed-node2 15980 1727204179.63291: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15980 1727204179.63293: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204179.63307: getting variables 15980 1727204179.63309: in VariableManager get_vars() 15980 1727204179.63350: Calling all_inventory to load vars for managed-node2 15980 1727204179.63353: Calling groups_inventory to load vars for managed-node2 15980 1727204179.63355: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204179.63491: Calling all_plugins_play to load vars for managed-node2 15980 1727204179.63496: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204179.63503: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000005e 15980 1727204179.63505: WORKER PROCESS EXITING 15980 1727204179.63509: Calling groups_plugins_play to load vars for managed-node2 15980 1727204179.65493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204179.68060: done with get_vars() 15980 1727204179.68305: done getting variables 15980 1727204179.68363: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.123) 0:00:41.094 ***** 15980 1727204179.68397: entering _queue_task() for managed-node2/dnf 15980 1727204179.69190: worker is 1 (out of 1 available) 15980 1727204179.69205: exiting _queue_task() for managed-node2/dnf 15980 1727204179.69218: done queuing things up, now waiting for results queue to drain 15980 1727204179.69220: waiting for pending results... 
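
The DNF check task queued above is gated on wireless or team profiles being present: the trace below resolves `network_connections`, `profile`, and `interface`, then evaluates `__network_wireless_connections_defined or __network_team_connections_defined` to False. A hypothetical sketch of how such a guard variable could be derived from `network_connections` (the collection's real defaults may compute it differently):

```yaml
# Hypothetical: one way to flag wireless profiles in network_connections;
# the role's actual defaults may use a different expression.
__network_wireless_connections_defined: >-
  {{ network_connections
     | selectattr('type', 'defined')
     | selectattr('type', 'eq', 'wireless')
     | list | length > 0 }}
```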
15980 1727204179.69986: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15980 1727204179.69993: in run() - task 127b8e07-fff9-5f1d-4b72-00000000005f 15980 1727204179.70373: variable 'ansible_search_path' from source: unknown 15980 1727204179.70378: variable 'ansible_search_path' from source: unknown 15980 1727204179.70382: calling self._execute() 15980 1727204179.70385: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204179.70388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204179.70392: variable 'omit' from source: magic vars 15980 1727204179.71363: variable 'ansible_distribution_major_version' from source: facts 15980 1727204179.71488: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204179.71924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204179.77079: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204179.77164: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204179.77472: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204179.77476: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204179.77492: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204179.77596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204179.77823: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204179.77860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204179.77910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204179.77936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204179.78281: variable 'ansible_distribution' from source: facts 15980 1727204179.78291: variable 'ansible_distribution_major_version' from source: facts 15980 1727204179.78305: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15980 1727204179.78446: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204179.78820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204179.79271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204179.79275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204179.79278: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204179.79281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204179.79283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204179.79286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204179.79313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204179.79362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204179.79672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204179.79675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204179.79678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 
1727204179.79704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204179.79754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204179.80072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204179.80183: variable 'network_connections' from source: play vars 15980 1727204179.80203: variable 'profile' from source: play vars 15980 1727204179.80349: variable 'profile' from source: play vars 15980 1727204179.80479: variable 'interface' from source: set_fact 15980 1727204179.80558: variable 'interface' from source: set_fact 15980 1727204179.80668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204179.80868: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204179.80917: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204179.80956: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204179.80993: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204179.81048: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204179.81077: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204179.81110: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204179.81142: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204179.81196: variable '__network_team_connections_defined' from source: role '' defaults 15980 1727204179.81546: variable 'network_connections' from source: play vars 15980 1727204179.81552: variable 'profile' from source: play vars 15980 1727204179.81636: variable 'profile' from source: play vars 15980 1727204179.81646: variable 'interface' from source: set_fact 15980 1727204179.81728: variable 'interface' from source: set_fact 15980 1727204179.81761: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15980 1727204179.81772: when evaluation is False, skipping this task 15980 1727204179.81780: _execute() done 15980 1727204179.81788: dumping result to json 15980 1727204179.81800: done dumping result, returning 15980 1727204179.81813: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-5f1d-4b72-00000000005f] 15980 1727204179.81825: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000005f 15980 1727204179.82097: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000005f 15980 1727204179.82100: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 15980 1727204179.82160: no more pending results, returning what we have 15980 1727204179.82167: results queue empty 15980 1727204179.82169: checking for any_errors_fatal 15980 1727204179.82178: done checking for any_errors_fatal 15980 1727204179.82179: checking for max_fail_percentage 15980 1727204179.82180: done checking for max_fail_percentage 15980 1727204179.82181: checking to see if all hosts have failed and the running result is not ok 15980 1727204179.82183: done checking to see if all hosts have failed 15980 1727204179.82183: getting the remaining hosts for this loop 15980 1727204179.82186: done getting the remaining hosts for this loop 15980 1727204179.82190: getting the next task for host managed-node2 15980 1727204179.82197: done getting next task for host managed-node2 15980 1727204179.82202: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15980 1727204179.82205: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204179.82227: getting variables 15980 1727204179.82229: in VariableManager get_vars() 15980 1727204179.82338: Calling all_inventory to load vars for managed-node2 15980 1727204179.82341: Calling groups_inventory to load vars for managed-node2 15980 1727204179.82344: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204179.82356: Calling all_plugins_play to load vars for managed-node2 15980 1727204179.82359: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204179.82362: Calling groups_plugins_play to load vars for managed-node2 15980 1727204179.84848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204179.87693: done with get_vars() 15980 1727204179.87728: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15980 1727204179.87891: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.195) 0:00:41.289 ***** 15980 1727204179.87922: entering _queue_task() for managed-node2/yum 15980 1727204179.88520: worker is 1 (out of 1 available) 15980 1727204179.88650: exiting _queue_task() for managed-node2/yum 15980 1727204179.88662: done queuing things up, now waiting for results queue to drain 15980 1727204179.88664: waiting for pending results... 
15980 1727204179.89386: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15980 1727204179.89429: in run() - task 127b8e07-fff9-5f1d-4b72-000000000060 15980 1727204179.89460: variable 'ansible_search_path' from source: unknown 15980 1727204179.89481: variable 'ansible_search_path' from source: unknown 15980 1727204179.89529: calling self._execute() 15980 1727204179.89647: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204179.89660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204179.89678: variable 'omit' from source: magic vars 15980 1727204179.90111: variable 'ansible_distribution_major_version' from source: facts 15980 1727204179.90134: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204179.90344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204179.94389: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204179.94642: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204179.94972: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204179.94976: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204179.94996: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204179.95374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204179.95378: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204179.95390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204179.95445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204179.95503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204179.95788: variable 'ansible_distribution_major_version' from source: facts 15980 1727204179.95853: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15980 1727204179.95885: when evaluation is False, skipping this task 15980 1727204179.95893: _execute() done 15980 1727204179.95945: dumping result to json 15980 1727204179.95954: done dumping result, returning 15980 1727204179.95970: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-5f1d-4b72-000000000060] 15980 1727204179.96027: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000060 15980 1727204179.96284: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000060 15980 1727204179.96288: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15980 1727204179.96350: no more pending results, returning 
what we have 15980 1727204179.96355: results queue empty 15980 1727204179.96357: checking for any_errors_fatal 15980 1727204179.96366: done checking for any_errors_fatal 15980 1727204179.96367: checking for max_fail_percentage 15980 1727204179.96369: done checking for max_fail_percentage 15980 1727204179.96370: checking to see if all hosts have failed and the running result is not ok 15980 1727204179.96371: done checking to see if all hosts have failed 15980 1727204179.96372: getting the remaining hosts for this loop 15980 1727204179.96374: done getting the remaining hosts for this loop 15980 1727204179.96379: getting the next task for host managed-node2 15980 1727204179.96386: done getting next task for host managed-node2 15980 1727204179.96391: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15980 1727204179.96393: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204179.96412: getting variables 15980 1727204179.96414: in VariableManager get_vars() 15980 1727204179.96462: Calling all_inventory to load vars for managed-node2 15980 1727204179.96789: Calling groups_inventory to load vars for managed-node2 15980 1727204179.96794: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204179.96805: Calling all_plugins_play to load vars for managed-node2 15980 1727204179.96810: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204179.96813: Calling groups_plugins_play to load vars for managed-node2 15980 1727204179.98684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204180.02215: done with get_vars() 15980 1727204180.02259: done getting variables 15980 1727204180.02331: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:56:20 -0400 (0:00:00.144) 0:00:41.433 ***** 15980 1727204180.02368: entering _queue_task() for managed-node2/fail 15980 1727204180.02762: worker is 1 (out of 1 available) 15980 1727204180.02781: exiting _queue_task() for managed-node2/fail 15980 1727204180.02796: done queuing things up, now waiting for results queue to drain 15980 1727204180.02798: waiting for pending results... 
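[Editor's note] The entries above show the role skipping its YUM-based update check: the guard `ansible_distribution_major_version | int < 8` evaluated to False, and `ansible.builtin.yum` was redirected to `ansible.builtin.dnf` on this host. A minimal sketch of what such a version-guarded task looks like — the task body and names here are illustrative assumptions, not the role's actual source:

```yaml
# Hedged sketch of a version-guarded check, modeled on the conditional
# seen in the log. Only the `when` expression comes from the log itself.
- name: Check if updates for network packages are available through YUM
  ansible.builtin.yum:   # transparently redirected to ansible.builtin.dnf on newer releases
    list: updates
  register: __yum_updates   # hypothetical register name
  when: ansible_distribution_major_version | int < 8
```

On a host reporting a major version of 8 or later, the `when` clause is False and Ansible records exactly the `skipping: ... "skip_reason": "Conditional result was False"` result shown above.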
15980 1727204180.03114: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15980 1727204180.03319: in run() - task 127b8e07-fff9-5f1d-4b72-000000000061 15980 1727204180.03323: variable 'ansible_search_path' from source: unknown 15980 1727204180.03326: variable 'ansible_search_path' from source: unknown 15980 1727204180.03330: calling self._execute() 15980 1727204180.03423: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204180.03540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204180.03544: variable 'omit' from source: magic vars 15980 1727204180.04479: variable 'ansible_distribution_major_version' from source: facts 15980 1727204180.04623: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204180.04874: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204180.05108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204180.09992: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204180.10080: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204180.10135: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204180.10177: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204180.10206: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204180.10308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15980 1727204180.10803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204180.10871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.10902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204180.10923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204180.11070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204180.11074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204180.11076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.11103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204180.11123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204180.11225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204180.11255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204180.11287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.11341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204180.11361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204180.11579: variable 'network_connections' from source: play vars 15980 1727204180.11597: variable 'profile' from source: play vars 15980 1727204180.11687: variable 'profile' from source: play vars 15980 1727204180.11696: variable 'interface' from source: set_fact 15980 1727204180.11777: variable 'interface' from source: set_fact 15980 1727204180.11869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204180.12080: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204180.12119: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204180.12183: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204180.12199: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204180.12252: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204180.12281: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204180.12319: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.12399: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204180.12416: variable '__network_team_connections_defined' from source: role '' defaults 15980 1727204180.13107: variable 'network_connections' from source: play vars 15980 1727204180.13209: variable 'profile' from source: play vars 15980 1727204180.13416: variable 'profile' from source: play vars 15980 1727204180.13491: variable 'interface' from source: set_fact 15980 1727204180.13767: variable 'interface' from source: set_fact 15980 1727204180.13839: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15980 1727204180.13874: when evaluation is False, skipping this task 15980 1727204180.13878: _execute() done 15980 1727204180.13880: dumping result to json 15980 1727204180.13937: done dumping result, returning 15980 1727204180.13940: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-5f1d-4b72-000000000061] 15980 1727204180.13950: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000061 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15980 1727204180.14357: no more pending results, returning what we have 15980 1727204180.14396: results queue empty 15980 1727204180.14398: checking for any_errors_fatal 15980 1727204180.14405: done checking for any_errors_fatal 15980 1727204180.14406: checking for max_fail_percentage 15980 1727204180.14447: done checking for max_fail_percentage 15980 1727204180.14449: checking to see if all hosts have failed and the running result is not ok 15980 1727204180.14451: done checking to see if all hosts have failed 15980 1727204180.14452: getting the remaining hosts for this loop 15980 1727204180.14454: done getting the remaining hosts for this loop 15980 1727204180.14459: getting the next task for host managed-node2 15980 1727204180.14467: done getting next task for host managed-node2 15980 1727204180.14472: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15980 1727204180.14474: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204180.14491: getting variables 15980 1727204180.14493: in VariableManager get_vars() 15980 1727204180.14663: Calling all_inventory to load vars for managed-node2 15980 1727204180.14728: Calling groups_inventory to load vars for managed-node2 15980 1727204180.14731: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204180.14738: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000061 15980 1727204180.14740: WORKER PROCESS EXITING 15980 1727204180.14776: Calling all_plugins_play to load vars for managed-node2 15980 1727204180.14780: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204180.14784: Calling groups_plugins_play to load vars for managed-node2 15980 1727204180.17627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204180.20425: done with get_vars() 15980 1727204180.20491: done getting variables 15980 1727204180.20655: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:56:20 -0400 (0:00:00.183) 0:00:41.617 ***** 15980 1727204180.20715: entering _queue_task() for managed-node2/package 15980 1727204180.21319: worker is 1 (out of 1 available) 15980 1727204180.21344: exiting _queue_task() for managed-node2/package 15980 1727204180.21358: done queuing things up, now waiting for results queue to drain 15980 1727204180.21360: waiting for pending results... 
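[Editor's note] The consent task is skipped the same way: `__network_wireless_connections_defined or __network_team_connections_defined` evaluates to False, so the `fail` action never executes. A hedged sketch of that gating pattern — the two `__network_*` variable names are taken from the log, while the message text and the `network_allow_restart` override variable are hypothetical:

```yaml
# Sketch of a consent gate built on ansible.builtin.fail; the role only
# reaches the failure when wireless or team connections are requested.
- name: Ask user's consent to restart NetworkManager
  ansible.builtin.fail:
    msg: >-
      Wireless or team interfaces require restarting NetworkManager;
      set network_allow_restart=true (hypothetical flag) to proceed.
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined
    - not (network_allow_restart | default(false))
```

Because neither wireless nor team profiles are defined in this run, the conditional short-circuits to False and the task is skipped rather than failing the play.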
15980 1727204180.21616: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 15980 1727204180.21761: in run() - task 127b8e07-fff9-5f1d-4b72-000000000062 15980 1727204180.21792: variable 'ansible_search_path' from source: unknown 15980 1727204180.21871: variable 'ansible_search_path' from source: unknown 15980 1727204180.21875: calling self._execute() 15980 1727204180.21989: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204180.22000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204180.22024: variable 'omit' from source: magic vars 15980 1727204180.22494: variable 'ansible_distribution_major_version' from source: facts 15980 1727204180.22527: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204180.22783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204180.23111: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204180.23172: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204180.23272: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204180.23298: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204180.23439: variable 'network_packages' from source: role '' defaults 15980 1727204180.23597: variable '__network_provider_setup' from source: role '' defaults 15980 1727204180.23859: variable '__network_service_name_default_nm' from source: role '' defaults 15980 1727204180.23863: variable '__network_service_name_default_nm' from source: role '' defaults 15980 1727204180.23869: variable '__network_packages_default_nm' from source: role '' defaults 15980 1727204180.23894: variable 
'__network_packages_default_nm' from source: role '' defaults 15980 1727204180.24169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204180.27149: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204180.27379: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204180.27430: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204180.27504: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204180.27533: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204180.27740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204180.27807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204180.27829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.27872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204180.27887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 
1727204180.27945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204180.27967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204180.27989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.28036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204180.28051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204180.28281: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15980 1727204180.28387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204180.28405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204180.28427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.28458: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204180.28472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204180.28576: variable 'ansible_python' from source: facts 15980 1727204180.28587: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15980 1727204180.28671: variable '__network_wpa_supplicant_required' from source: role '' defaults 15980 1727204180.28757: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15980 1727204180.28902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204180.28918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204180.28940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.28970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204180.28982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204180.29035: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204180.29053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204180.29074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.29101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204180.29112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204180.29278: variable 'network_connections' from source: play vars 15980 1727204180.29282: variable 'profile' from source: play vars 15980 1727204180.29399: variable 'profile' from source: play vars 15980 1727204180.29402: variable 'interface' from source: set_fact 15980 1727204180.29501: variable 'interface' from source: set_fact 15980 1727204180.29635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204180.29638: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204180.29684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.29695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204180.29977: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204180.30303: variable 'network_connections' from source: play vars 15980 1727204180.30426: variable 'profile' from source: play vars 15980 1727204180.30522: variable 'profile' from source: play vars 15980 1727204180.30546: variable 'interface' from source: set_fact 15980 1727204180.30725: variable 'interface' from source: set_fact 15980 1727204180.30793: variable '__network_packages_default_wireless' from source: role '' defaults 15980 1727204180.30950: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204180.31221: variable 'network_connections' from source: play vars 15980 1727204180.31224: variable 'profile' from source: play vars 15980 1727204180.31276: variable 'profile' from source: play vars 15980 1727204180.31280: variable 'interface' from source: set_fact 15980 1727204180.31354: variable 'interface' from source: set_fact 15980 1727204180.31377: variable '__network_packages_default_team' from source: role '' defaults 15980 1727204180.31437: variable '__network_team_connections_defined' from source: role '' defaults 15980 1727204180.31651: variable 'network_connections' from source: play vars 15980 1727204180.31654: variable 'profile' from source: play vars 15980 1727204180.31715: variable 'profile' from source: play vars 15980 1727204180.31720: variable 'interface' from source: set_fact 15980 1727204180.31794: variable 'interface' from source: set_fact 15980 1727204180.31841: variable '__network_service_name_default_initscripts' from source: role '' defaults 15980 1727204180.31888: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 15980 1727204180.31895: variable '__network_packages_default_initscripts' from source: role '' defaults 15980 1727204180.31944: variable '__network_packages_default_initscripts' from source: role '' defaults 15980 1727204180.32095: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15980 1727204180.32489: variable 'network_connections' from source: play vars 15980 1727204180.32492: variable 'profile' from source: play vars 15980 1727204180.32560: variable 'profile' from source: play vars 15980 1727204180.32564: variable 'interface' from source: set_fact 15980 1727204180.32616: variable 'interface' from source: set_fact 15980 1727204180.32625: variable 'ansible_distribution' from source: facts 15980 1727204180.32631: variable '__network_rh_distros' from source: role '' defaults 15980 1727204180.32637: variable 'ansible_distribution_major_version' from source: facts 15980 1727204180.32659: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15980 1727204180.32811: variable 'ansible_distribution' from source: facts 15980 1727204180.32814: variable '__network_rh_distros' from source: role '' defaults 15980 1727204180.32820: variable 'ansible_distribution_major_version' from source: facts 15980 1727204180.32827: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15980 1727204180.32982: variable 'ansible_distribution' from source: facts 15980 1727204180.32985: variable '__network_rh_distros' from source: role '' defaults 15980 1727204180.32988: variable 'ansible_distribution_major_version' from source: facts 15980 1727204180.33019: variable 'network_provider' from source: set_fact 15980 1727204180.33033: variable 'ansible_facts' from source: unknown 15980 1727204180.34745: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15980 
1727204180.34749: when evaluation is False, skipping this task 15980 1727204180.34752: _execute() done 15980 1727204180.34754: dumping result to json 15980 1727204180.34757: done dumping result, returning 15980 1727204180.34760: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-5f1d-4b72-000000000062] 15980 1727204180.34762: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000062 skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15980 1727204180.34932: no more pending results, returning what we have 15980 1727204180.34938: results queue empty 15980 1727204180.34939: checking for any_errors_fatal 15980 1727204180.34947: done checking for any_errors_fatal 15980 1727204180.34948: checking for max_fail_percentage 15980 1727204180.34949: done checking for max_fail_percentage 15980 1727204180.34950: checking to see if all hosts have failed and the running result is not ok 15980 1727204180.34951: done checking to see if all hosts have failed 15980 1727204180.34952: getting the remaining hosts for this loop 15980 1727204180.34954: done getting the remaining hosts for this loop 15980 1727204180.34959: getting the next task for host managed-node2 15980 1727204180.34966: done getting next task for host managed-node2 15980 1727204180.34970: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15980 1727204180.34973: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204180.34989: getting variables 15980 1727204180.34990: in VariableManager get_vars() 15980 1727204180.35032: Calling all_inventory to load vars for managed-node2 15980 1727204180.35036: Calling groups_inventory to load vars for managed-node2 15980 1727204180.35038: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204180.35048: Calling all_plugins_play to load vars for managed-node2 15980 1727204180.35056: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204180.35059: Calling groups_plugins_play to load vars for managed-node2 15980 1727204180.35115: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000062 15980 1727204180.35119: WORKER PROCESS EXITING 15980 1727204180.36613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204180.38938: done with get_vars() 15980 1727204180.38979: done getting variables 15980 1727204180.39050: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:56:20 -0400 (0:00:00.183) 0:00:41.801 ***** 15980 1727204180.39082: entering _queue_task() for managed-node2/package 15980 1727204180.39405: worker is 1 (out of 1 available) 15980 1727204180.39422: exiting _queue_task() for managed-node2/package 15980 1727204180.39440: done queuing things up, now waiting for results queue to drain 15980 1727204180.39442: waiting for pending results... 
15980 1727204180.39643: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15980 1727204180.39721: in run() - task 127b8e07-fff9-5f1d-4b72-000000000063 15980 1727204180.39735: variable 'ansible_search_path' from source: unknown 15980 1727204180.39739: variable 'ansible_search_path' from source: unknown 15980 1727204180.39774: calling self._execute() 15980 1727204180.39870: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204180.39874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204180.39884: variable 'omit' from source: magic vars 15980 1727204180.40357: variable 'ansible_distribution_major_version' from source: facts 15980 1727204180.40368: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204180.40462: variable 'network_state' from source: role '' defaults 15980 1727204180.40484: Evaluated conditional (network_state != {}): False 15980 1727204180.40487: when evaluation is False, skipping this task 15980 1727204180.40491: _execute() done 15980 1727204180.40493: dumping result to json 15980 1727204180.40496: done dumping result, returning 15980 1727204180.40499: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-5f1d-4b72-000000000063] 15980 1727204180.40502: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000063 15980 1727204180.40612: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000063 15980 1727204180.40615: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15980 1727204180.40682: no more pending results, returning what we have 15980 1727204180.40687: results queue empty 15980 1727204180.40689: checking 
for any_errors_fatal 15980 1727204180.40697: done checking for any_errors_fatal 15980 1727204180.40698: checking for max_fail_percentage 15980 1727204180.40700: done checking for max_fail_percentage 15980 1727204180.40701: checking to see if all hosts have failed and the running result is not ok 15980 1727204180.40702: done checking to see if all hosts have failed 15980 1727204180.40703: getting the remaining hosts for this loop 15980 1727204180.40705: done getting the remaining hosts for this loop 15980 1727204180.40709: getting the next task for host managed-node2 15980 1727204180.40715: done getting next task for host managed-node2 15980 1727204180.40722: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15980 1727204180.40726: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204180.40748: getting variables 15980 1727204180.40751: in VariableManager get_vars() 15980 1727204180.40801: Calling all_inventory to load vars for managed-node2 15980 1727204180.40804: Calling groups_inventory to load vars for managed-node2 15980 1727204180.40808: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204180.40821: Calling all_plugins_play to load vars for managed-node2 15980 1727204180.40824: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204180.40828: Calling groups_plugins_play to load vars for managed-node2 15980 1727204180.43311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204180.47035: done with get_vars() 15980 1727204180.47125: done getting variables 15980 1727204180.47463: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:56:20 -0400 (0:00:00.084) 0:00:41.885 ***** 15980 1727204180.47512: entering _queue_task() for managed-node2/package 15980 1727204180.48929: worker is 1 (out of 1 available) 15980 1727204180.48945: exiting _queue_task() for managed-node2/package 15980 1727204180.49085: done queuing things up, now waiting for results queue to drain 15980 1727204180.49094: waiting for pending results... 
15980 1727204180.49775: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15980 1727204180.50228: in run() - task 127b8e07-fff9-5f1d-4b72-000000000064 15980 1727204180.50248: variable 'ansible_search_path' from source: unknown 15980 1727204180.50252: variable 'ansible_search_path' from source: unknown 15980 1727204180.50309: calling self._execute() 15980 1727204180.50655: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204180.50660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204180.50670: variable 'omit' from source: magic vars 15980 1727204180.51417: variable 'ansible_distribution_major_version' from source: facts 15980 1727204180.51432: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204180.51787: variable 'network_state' from source: role '' defaults 15980 1727204180.51801: Evaluated conditional (network_state != {}): False 15980 1727204180.51805: when evaluation is False, skipping this task 15980 1727204180.51808: _execute() done 15980 1727204180.51811: dumping result to json 15980 1727204180.51813: done dumping result, returning 15980 1727204180.51852: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-5f1d-4b72-000000000064] 15980 1727204180.51861: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000064 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15980 1727204180.52214: no more pending results, returning what we have 15980 1727204180.52219: results queue empty 15980 1727204180.52220: checking for any_errors_fatal 15980 1727204180.52230: done checking for any_errors_fatal 15980 1727204180.52232: checking for max_fail_percentage 15980 
1727204180.52234: done checking for max_fail_percentage 15980 1727204180.52235: checking to see if all hosts have failed and the running result is not ok 15980 1727204180.52236: done checking to see if all hosts have failed 15980 1727204180.52237: getting the remaining hosts for this loop 15980 1727204180.52239: done getting the remaining hosts for this loop 15980 1727204180.52244: getting the next task for host managed-node2 15980 1727204180.52251: done getting next task for host managed-node2 15980 1727204180.52258: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15980 1727204180.52261: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204180.52282: getting variables 15980 1727204180.52284: in VariableManager get_vars() 15980 1727204180.52328: Calling all_inventory to load vars for managed-node2 15980 1727204180.52331: Calling groups_inventory to load vars for managed-node2 15980 1727204180.52333: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204180.52348: Calling all_plugins_play to load vars for managed-node2 15980 1727204180.52351: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204180.52353: Calling groups_plugins_play to load vars for managed-node2 15980 1727204180.53025: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000064 15980 1727204180.53031: WORKER PROCESS EXITING 15980 1727204180.55727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204180.57797: done with get_vars() 15980 1727204180.57842: done getting variables 15980 1727204180.57908: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:56:20 -0400 (0:00:00.104) 0:00:41.989 ***** 15980 1727204180.57935: entering _queue_task() for managed-node2/service 15980 1727204180.58304: worker is 1 (out of 1 available) 15980 1727204180.58319: exiting _queue_task() for managed-node2/service 15980 1727204180.58333: done queuing things up, now waiting for results queue to drain 15980 1727204180.58335: waiting for pending results... 15980 1727204180.58593: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15980 1727204180.58671: in run() - task 127b8e07-fff9-5f1d-4b72-000000000065 15980 1727204180.58684: variable 'ansible_search_path' from source: unknown 15980 1727204180.58687: variable 'ansible_search_path' from source: unknown 15980 1727204180.58734: calling self._execute() 15980 1727204180.58826: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204180.58835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204180.58844: variable 'omit' from source: magic vars 15980 1727204180.59285: variable 'ansible_distribution_major_version' from source: facts 15980 1727204180.59289: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204180.59402: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204180.59737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15980 1727204180.62230: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204180.62316: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204180.62379: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204180.62402: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204180.62420: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204180.62541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204180.62561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204180.62586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.62639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204180.62685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204180.62758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15980 1727204180.62761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204180.62879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.62883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204180.62886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204180.62888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204180.62923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204180.62972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.63015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204180.63037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204180.63286: variable 'network_connections' from source: play vars 15980 1727204180.63300: variable 'profile' from source: play vars 15980 1727204180.63472: variable 'profile' from source: play vars 15980 1727204180.63476: variable 'interface' from source: set_fact 15980 1727204180.63505: variable 'interface' from source: set_fact 15980 1727204180.63733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204180.64049: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204180.64102: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204180.64143: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204180.64174: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204180.64240: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204180.64263: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204180.64472: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.64475: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204180.64478: variable 
'__network_team_connections_defined' from source: role '' defaults 15980 1727204180.64708: variable 'network_connections' from source: play vars 15980 1727204180.64715: variable 'profile' from source: play vars 15980 1727204180.64798: variable 'profile' from source: play vars 15980 1727204180.64801: variable 'interface' from source: set_fact 15980 1727204180.64877: variable 'interface' from source: set_fact 15980 1727204180.64906: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15980 1727204180.64909: when evaluation is False, skipping this task 15980 1727204180.64912: _execute() done 15980 1727204180.64915: dumping result to json 15980 1727204180.64917: done dumping result, returning 15980 1727204180.64929: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-5f1d-4b72-000000000065] 15980 1727204180.64946: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000065 15980 1727204180.65049: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000065 15980 1727204180.65052: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15980 1727204180.65105: no more pending results, returning what we have 15980 1727204180.65110: results queue empty 15980 1727204180.65111: checking for any_errors_fatal 15980 1727204180.65120: done checking for any_errors_fatal 15980 1727204180.65121: checking for max_fail_percentage 15980 1727204180.65123: done checking for max_fail_percentage 15980 1727204180.65124: checking to see if all hosts have failed and the running result is not ok 15980 1727204180.65125: done checking to see if all hosts have failed 15980 1727204180.65126: getting the remaining hosts for this loop 15980 1727204180.65128: 
done getting the remaining hosts for this loop 15980 1727204180.65133: getting the next task for host managed-node2 15980 1727204180.65145: done getting next task for host managed-node2 15980 1727204180.65150: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15980 1727204180.65152: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204180.65173: getting variables 15980 1727204180.65175: in VariableManager get_vars() 15980 1727204180.65220: Calling all_inventory to load vars for managed-node2 15980 1727204180.65223: Calling groups_inventory to load vars for managed-node2 15980 1727204180.65226: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204180.65238: Calling all_plugins_play to load vars for managed-node2 15980 1727204180.65241: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204180.65245: Calling groups_plugins_play to load vars for managed-node2 15980 1727204180.67415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204180.69894: done with get_vars() 15980 1727204180.69926: done getting variables 15980 1727204180.70006: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:56:20 -0400 
(0:00:00.121) 0:00:42.110 ***** 15980 1727204180.70041: entering _queue_task() for managed-node2/service 15980 1727204180.70523: worker is 1 (out of 1 available) 15980 1727204180.70539: exiting _queue_task() for managed-node2/service 15980 1727204180.70551: done queuing things up, now waiting for results queue to drain 15980 1727204180.70555: waiting for pending results... 15980 1727204180.70849: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15980 1727204180.71174: in run() - task 127b8e07-fff9-5f1d-4b72-000000000066 15980 1727204180.71180: variable 'ansible_search_path' from source: unknown 15980 1727204180.71183: variable 'ansible_search_path' from source: unknown 15980 1727204180.71187: calling self._execute() 15980 1727204180.71191: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204180.71199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204180.71212: variable 'omit' from source: magic vars 15980 1727204180.71685: variable 'ansible_distribution_major_version' from source: facts 15980 1727204180.71699: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204180.71909: variable 'network_provider' from source: set_fact 15980 1727204180.71913: variable 'network_state' from source: role '' defaults 15980 1727204180.71930: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15980 1727204180.71939: variable 'omit' from source: magic vars 15980 1727204180.71996: variable 'omit' from source: magic vars 15980 1727204180.72031: variable 'network_service_name' from source: role '' defaults 15980 1727204180.72111: variable 'network_service_name' from source: role '' defaults 15980 1727204180.72224: variable '__network_provider_setup' from source: role '' defaults 15980 1727204180.72230: variable '__network_service_name_default_nm' from source: role '' defaults 15980 
1727204180.72307: variable '__network_service_name_default_nm' from source: role '' defaults 15980 1727204180.72316: variable '__network_packages_default_nm' from source: role '' defaults 15980 1727204180.72391: variable '__network_packages_default_nm' from source: role '' defaults 15980 1727204180.72659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204180.75289: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204180.75370: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204180.75408: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204180.75572: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204180.75575: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204180.75595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204180.75630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204180.75659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.75712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 
1727204180.75728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204180.75787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204180.75810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204180.75834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.75881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204180.75901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204180.76181: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15980 1727204180.76335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204180.76359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204180.76472: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.76475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204180.76478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204180.76561: variable 'ansible_python' from source: facts 15980 1727204180.76586: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15980 1727204180.76692: variable '__network_wpa_supplicant_required' from source: role '' defaults 15980 1727204180.76792: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15980 1727204180.76948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204180.76986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204180.77013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.77057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204180.77079: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204180.77136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204180.77272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204180.77275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.77279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204180.77282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204180.77419: variable 'network_connections' from source: play vars 15980 1727204180.77429: variable 'profile' from source: play vars 15980 1727204180.77513: variable 'profile' from source: play vars 15980 1727204180.77528: variable 'interface' from source: set_fact 15980 1727204180.77589: variable 'interface' from source: set_fact 15980 1727204180.77715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204180.77936: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204180.78000: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204180.78045: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204180.78113: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204180.78191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204180.78222: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204180.78255: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204180.78302: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204180.78373: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204180.78705: variable 'network_connections' from source: play vars 15980 1727204180.78714: variable 'profile' from source: play vars 15980 1727204180.78802: variable 'profile' from source: play vars 15980 1727204180.78806: variable 'interface' from source: set_fact 15980 1727204180.78887: variable 'interface' from source: set_fact 15980 1727204180.78928: variable '__network_packages_default_wireless' from source: role '' defaults 15980 1727204180.79023: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204180.79375: variable 'network_connections' from source: play vars 15980 1727204180.79379: variable 'profile' from source: play vars 15980 
1727204180.79500: variable 'profile' from source: play vars 15980 1727204180.79504: variable 'interface' from source: set_fact 15980 1727204180.79554: variable 'interface' from source: set_fact 15980 1727204180.79582: variable '__network_packages_default_team' from source: role '' defaults 15980 1727204180.79669: variable '__network_team_connections_defined' from source: role '' defaults 15980 1727204180.79904: variable 'network_connections' from source: play vars 15980 1727204180.79908: variable 'profile' from source: play vars 15980 1727204180.79975: variable 'profile' from source: play vars 15980 1727204180.79978: variable 'interface' from source: set_fact 15980 1727204180.80033: variable 'interface' from source: set_fact 15980 1727204180.80081: variable '__network_service_name_default_initscripts' from source: role '' defaults 15980 1727204180.80124: variable '__network_service_name_default_initscripts' from source: role '' defaults 15980 1727204180.80131: variable '__network_packages_default_initscripts' from source: role '' defaults 15980 1727204180.80177: variable '__network_packages_default_initscripts' from source: role '' defaults 15980 1727204180.80327: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15980 1727204180.80667: variable 'network_connections' from source: play vars 15980 1727204180.80671: variable 'profile' from source: play vars 15980 1727204180.80718: variable 'profile' from source: play vars 15980 1727204180.80722: variable 'interface' from source: set_fact 15980 1727204180.80774: variable 'interface' from source: set_fact 15980 1727204180.80782: variable 'ansible_distribution' from source: facts 15980 1727204180.80786: variable '__network_rh_distros' from source: role '' defaults 15980 1727204180.80794: variable 'ansible_distribution_major_version' from source: facts 15980 1727204180.80804: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15980 
1727204180.80934: variable 'ansible_distribution' from source: facts 15980 1727204180.80938: variable '__network_rh_distros' from source: role '' defaults 15980 1727204180.80940: variable 'ansible_distribution_major_version' from source: facts 15980 1727204180.80943: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15980 1727204180.81068: variable 'ansible_distribution' from source: facts 15980 1727204180.81072: variable '__network_rh_distros' from source: role '' defaults 15980 1727204180.81076: variable 'ansible_distribution_major_version' from source: facts 15980 1727204180.81102: variable 'network_provider' from source: set_fact 15980 1727204180.81127: variable 'omit' from source: magic vars 15980 1727204180.81151: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204180.81178: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204180.81195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204180.81209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204180.81219: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204180.81247: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204180.81251: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204180.81255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204180.81330: Set connection var ansible_connection to ssh 15980 1727204180.81335: Set connection var ansible_pipelining to False 15980 1727204180.81340: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204180.81347: Set connection var 
ansible_timeout to 10 15980 1727204180.81353: Set connection var ansible_shell_type to sh 15980 1727204180.81358: Set connection var ansible_shell_executable to /bin/sh 15980 1727204180.81386: variable 'ansible_shell_executable' from source: unknown 15980 1727204180.81389: variable 'ansible_connection' from source: unknown 15980 1727204180.81392: variable 'ansible_module_compression' from source: unknown 15980 1727204180.81395: variable 'ansible_shell_type' from source: unknown 15980 1727204180.81398: variable 'ansible_shell_executable' from source: unknown 15980 1727204180.81400: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204180.81407: variable 'ansible_pipelining' from source: unknown 15980 1727204180.81410: variable 'ansible_timeout' from source: unknown 15980 1727204180.81412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204180.81495: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204180.81505: variable 'omit' from source: magic vars 15980 1727204180.81511: starting attempt loop 15980 1727204180.81514: running the handler 15980 1727204180.81608: variable 'ansible_facts' from source: unknown 15980 1727204180.82528: _low_level_execute_command(): starting 15980 1727204180.82533: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204180.83374: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204180.83379: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204180.83473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204180.85242: stdout chunk (state=3): >>>/root <<< 15980 1727204180.85351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204180.85415: stderr chunk (state=3): >>><<< 15980 1727204180.85419: stdout chunk (state=3): >>><<< 15980 1727204180.85446: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204180.85457: _low_level_execute_command(): starting 15980 1727204180.85463: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204180.854449-19074-157802359533204 `" && echo ansible-tmp-1727204180.854449-19074-157802359533204="` echo /root/.ansible/tmp/ansible-tmp-1727204180.854449-19074-157802359533204 `" ) && sleep 0' 15980 1727204180.86105: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204180.86110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204180.86123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204180.86153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204180.86256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204180.88221: stdout chunk (state=3): >>>ansible-tmp-1727204180.854449-19074-157802359533204=/root/.ansible/tmp/ansible-tmp-1727204180.854449-19074-157802359533204 <<< 15980 1727204180.88337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204180.88400: stderr chunk (state=3): >>><<< 15980 1727204180.88403: stdout chunk (state=3): >>><<< 15980 1727204180.88418: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204180.854449-19074-157802359533204=/root/.ansible/tmp/ansible-tmp-1727204180.854449-19074-157802359533204 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 15980 1727204180.88451: variable 'ansible_module_compression' from source: unknown 15980 1727204180.88499: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15980 1727204180.88555: variable 'ansible_facts' from source: unknown 15980 1727204180.88700: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204180.854449-19074-157802359533204/AnsiballZ_systemd.py 15980 1727204180.88822: Sending initial data 15980 1727204180.88825: Sent initial data (155 bytes) 15980 1727204180.89433: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204180.89446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204180.89450: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204180.89452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 15980 1727204180.89458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204180.89462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204180.89480: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15980 1727204180.89591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204180.91253: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 15980 1727204180.91285: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204180.91358: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204180.91429: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpyhrgw95g /root/.ansible/tmp/ansible-tmp-1727204180.854449-19074-157802359533204/AnsiballZ_systemd.py <<< 15980 1727204180.91436: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204180.854449-19074-157802359533204/AnsiballZ_systemd.py" <<< 15980 1727204180.91513: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpyhrgw95g" to remote "/root/.ansible/tmp/ansible-tmp-1727204180.854449-19074-157802359533204/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204180.854449-19074-157802359533204/AnsiballZ_systemd.py" <<< 15980 1727204180.93444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204180.93484: stderr chunk (state=3): >>><<< 15980 1727204180.93507: stdout chunk (state=3): >>><<< 15980 1727204180.93515: done transferring module to remote 15980 1727204180.93529: _low_level_execute_command(): starting 15980 1727204180.93535: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204180.854449-19074-157802359533204/ /root/.ansible/tmp/ansible-tmp-1727204180.854449-19074-157802359533204/AnsiballZ_systemd.py && sleep 0' 15980 1727204180.94124: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204180.94131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204180.94160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 
1727204180.94163: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204180.94169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 1727204180.94171: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204180.94234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204180.94243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204180.94254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204180.94350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204180.96192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204180.96252: stderr chunk (state=3): >>><<< 15980 1727204180.96256: stdout chunk (state=3): >>><<< 15980 1727204180.96273: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204180.96277: _low_level_execute_command(): starting 15980 1727204180.96280: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204180.854449-19074-157802359533204/AnsiballZ_systemd.py && sleep 0' 15980 1727204180.96797: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204180.96804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 1727204180.96806: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204180.96810: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204180.96812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 15980 1727204180.96819: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204180.96868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204180.96871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204180.96874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204180.96960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204181.28992: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4472832", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3530145792", "CPUUsageNSec": "796763000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": 
"infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": 
"cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", 
"FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target multi-user.target network.service shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service dbus.socket sysinit.target basic.target cloud-init-local.service system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:55 EDT", "StateChangeTimestampMonotonic": "382184288", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": 
"no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15980 1727204181.30988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204181.30993: stderr chunk (state=3): >>><<< 15980 1727204181.30996: stdout chunk (state=3): >>><<< 15980 1727204181.31176: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": 
"0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4472832", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3530145792", "CPUUsageNSec": "796763000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": 
"[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", 
"IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target multi-user.target network.service shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service dbus.socket sysinit.target basic.target cloud-init-local.service system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:55 EDT", 
"StateChangeTimestampMonotonic": "382184288", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 15980 1727204181.31233: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204180.854449-19074-157802359533204/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204181.31253: _low_level_execute_command(): starting 15980 1727204181.31257: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204180.854449-19074-157802359533204/ > /dev/null 2>&1 && sleep 0' 15980 1727204181.31908: stderr chunk (state=2): 
>>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204181.32143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204181.32149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204181.32152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204181.32170: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204181.32280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204181.34474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204181.34478: stdout chunk (state=3): >>><<< 15980 1727204181.34481: stderr chunk (state=3): >>><<< 15980 1727204181.34483: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204181.34486: handler run complete 15980 1727204181.34488: attempt loop complete, returning result 15980 1727204181.34490: _execute() done 15980 1727204181.34492: dumping result to json 15980 1727204181.34494: done dumping result, returning 15980 1727204181.34496: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-5f1d-4b72-000000000066] 15980 1727204181.34498: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000066 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15980 1727204181.34926: no more pending results, returning what we have 15980 1727204181.34930: results queue empty 15980 1727204181.34931: checking for any_errors_fatal 15980 1727204181.34945: done checking for any_errors_fatal 15980 1727204181.34946: checking for max_fail_percentage 15980 1727204181.34948: done checking for max_fail_percentage 15980 1727204181.34949: checking to see if all hosts have failed and the running result is not ok 15980 
1727204181.34950: done checking to see if all hosts have failed 15980 1727204181.34951: getting the remaining hosts for this loop 15980 1727204181.34953: done getting the remaining hosts for this loop 15980 1727204181.34958: getting the next task for host managed-node2 15980 1727204181.34964: done getting next task for host managed-node2 15980 1727204181.34970: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15980 1727204181.34972: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204181.34984: getting variables 15980 1727204181.34986: in VariableManager get_vars() 15980 1727204181.35028: Calling all_inventory to load vars for managed-node2 15980 1727204181.35032: Calling groups_inventory to load vars for managed-node2 15980 1727204181.35187: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204181.35197: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000066 15980 1727204181.35200: WORKER PROCESS EXITING 15980 1727204181.35212: Calling all_plugins_play to load vars for managed-node2 15980 1727204181.35216: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204181.35219: Calling groups_plugins_play to load vars for managed-node2 15980 1727204181.37181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204181.39414: done with get_vars() 15980 1727204181.39460: done getting variables 15980 1727204181.39539: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:56:21 -0400 (0:00:00.695) 0:00:42.806 ***** 15980 1727204181.39579: entering _queue_task() for managed-node2/service 15980 1727204181.40108: worker is 1 (out of 1 available) 15980 1727204181.40122: exiting _queue_task() for managed-node2/service 15980 1727204181.40134: done queuing things up, now waiting for results queue to drain 15980 1727204181.40137: waiting for pending results... 15980 1727204181.40373: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15980 1727204181.40586: in run() - task 127b8e07-fff9-5f1d-4b72-000000000067 15980 1727204181.40591: variable 'ansible_search_path' from source: unknown 15980 1727204181.40595: variable 'ansible_search_path' from source: unknown 15980 1727204181.40599: calling self._execute() 15980 1727204181.40720: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204181.40734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204181.40747: variable 'omit' from source: magic vars 15980 1727204181.41182: variable 'ansible_distribution_major_version' from source: facts 15980 1727204181.41200: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204181.41334: variable 'network_provider' from source: set_fact 15980 1727204181.41351: Evaluated conditional (network_provider == "nm"): True 15980 1727204181.41461: variable '__network_wpa_supplicant_required' from source: role '' defaults 15980 1727204181.41672: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 15980 1727204181.41761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204181.44657: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204181.44751: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204181.44800: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204181.44851: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204181.44890: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204181.44996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204181.45034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204181.45079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204181.45129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204181.45155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204181.45221: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204181.45251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204181.45371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204181.45377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204181.45380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204181.45414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204181.45443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204181.45481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204181.45533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 
1727204181.45592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204181.45741: variable 'network_connections' from source: play vars 15980 1727204181.45759: variable 'profile' from source: play vars 15980 1727204181.45849: variable 'profile' from source: play vars 15980 1727204181.45860: variable 'interface' from source: set_fact 15980 1727204181.45941: variable 'interface' from source: set_fact 15980 1727204181.46041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15980 1727204181.53975: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15980 1727204181.53980: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15980 1727204181.54012: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15980 1727204181.54051: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15980 1727204181.54127: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15980 1727204181.54157: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15980 1727204181.54195: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204181.54241: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15980 1727204181.54293: variable '__network_wireless_connections_defined' from source: role '' defaults 15980 1727204181.54624: variable 'network_connections' from source: play vars 15980 1727204181.54657: variable 'profile' from source: play vars 15980 1727204181.54756: variable 'profile' from source: play vars 15980 1727204181.54761: variable 'interface' from source: set_fact 15980 1727204181.54833: variable 'interface' from source: set_fact 15980 1727204181.54889: Evaluated conditional (__network_wpa_supplicant_required): False 15980 1727204181.54898: when evaluation is False, skipping this task 15980 1727204181.54974: _execute() done 15980 1727204181.54989: dumping result to json 15980 1727204181.54992: done dumping result, returning 15980 1727204181.54995: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-5f1d-4b72-000000000067] 15980 1727204181.54998: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000067 15980 1727204181.55080: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000067 15980 1727204181.55084: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15980 1727204181.55130: no more pending results, returning what we have 15980 1727204181.55134: results queue empty 15980 1727204181.55135: checking for any_errors_fatal 15980 1727204181.55150: done checking for any_errors_fatal 15980 1727204181.55151: checking for max_fail_percentage 15980 1727204181.55153: done checking for max_fail_percentage 15980 1727204181.55155: checking to see if all hosts have failed and the running result is not ok 15980 1727204181.55156: done checking to see if all hosts have failed 15980 1727204181.55156: getting the remaining hosts for 
this loop 15980 1727204181.55158: done getting the remaining hosts for this loop 15980 1727204181.55162: getting the next task for host managed-node2 15980 1727204181.55170: done getting next task for host managed-node2 15980 1727204181.55174: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15980 1727204181.55176: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204181.55192: getting variables 15980 1727204181.55194: in VariableManager get_vars() 15980 1727204181.55237: Calling all_inventory to load vars for managed-node2 15980 1727204181.55240: Calling groups_inventory to load vars for managed-node2 15980 1727204181.55243: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204181.55255: Calling all_plugins_play to load vars for managed-node2 15980 1727204181.55258: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204181.55261: Calling groups_plugins_play to load vars for managed-node2 15980 1727204181.64474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204181.66769: done with get_vars() 15980 1727204181.66814: done getting variables 15980 1727204181.66873: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 
14:56:21 -0400 (0:00:00.273) 0:00:43.079 ***** 15980 1727204181.66899: entering _queue_task() for managed-node2/service 15980 1727204181.67323: worker is 1 (out of 1 available) 15980 1727204181.67340: exiting _queue_task() for managed-node2/service 15980 1727204181.67353: done queuing things up, now waiting for results queue to drain 15980 1727204181.67355: waiting for pending results... 15980 1727204181.67862: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 15980 1727204181.67871: in run() - task 127b8e07-fff9-5f1d-4b72-000000000068 15980 1727204181.67875: variable 'ansible_search_path' from source: unknown 15980 1727204181.67879: variable 'ansible_search_path' from source: unknown 15980 1727204181.67959: calling self._execute() 15980 1727204181.68041: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204181.68056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204181.68083: variable 'omit' from source: magic vars 15980 1727204181.68550: variable 'ansible_distribution_major_version' from source: facts 15980 1727204181.68573: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204181.68722: variable 'network_provider' from source: set_fact 15980 1727204181.68831: Evaluated conditional (network_provider == "initscripts"): False 15980 1727204181.68835: when evaluation is False, skipping this task 15980 1727204181.68838: _execute() done 15980 1727204181.68842: dumping result to json 15980 1727204181.68844: done dumping result, returning 15980 1727204181.68847: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-5f1d-4b72-000000000068] 15980 1727204181.68849: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000068 15980 1727204181.69055: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000068 15980 1727204181.69060: 
WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15980 1727204181.69116: no more pending results, returning what we have 15980 1727204181.69120: results queue empty 15980 1727204181.69121: checking for any_errors_fatal 15980 1727204181.69134: done checking for any_errors_fatal 15980 1727204181.69135: checking for max_fail_percentage 15980 1727204181.69139: done checking for max_fail_percentage 15980 1727204181.69140: checking to see if all hosts have failed and the running result is not ok 15980 1727204181.69141: done checking to see if all hosts have failed 15980 1727204181.69142: getting the remaining hosts for this loop 15980 1727204181.69145: done getting the remaining hosts for this loop 15980 1727204181.69149: getting the next task for host managed-node2 15980 1727204181.69156: done getting next task for host managed-node2 15980 1727204181.69162: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15980 1727204181.69169: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204181.69190: getting variables 15980 1727204181.69192: in VariableManager get_vars() 15980 1727204181.69242: Calling all_inventory to load vars for managed-node2 15980 1727204181.69246: Calling groups_inventory to load vars for managed-node2 15980 1727204181.69248: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204181.69262: Calling all_plugins_play to load vars for managed-node2 15980 1727204181.69472: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204181.69479: Calling groups_plugins_play to load vars for managed-node2 15980 1727204181.71208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204181.73550: done with get_vars() 15980 1727204181.73585: done getting variables 15980 1727204181.73663: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:56:21 -0400 (0:00:00.067) 0:00:43.147 ***** 15980 1727204181.73709: entering _queue_task() for managed-node2/copy 15980 1727204181.74120: worker is 1 (out of 1 available) 15980 1727204181.74136: exiting _queue_task() for managed-node2/copy 15980 1727204181.74154: done queuing things up, now waiting for results queue to drain 15980 1727204181.74157: waiting for pending results... 
15980 1727204181.74602: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15980 1727204181.74620: in run() - task 127b8e07-fff9-5f1d-4b72-000000000069 15980 1727204181.74640: variable 'ansible_search_path' from source: unknown 15980 1727204181.74644: variable 'ansible_search_path' from source: unknown 15980 1727204181.74801: calling self._execute() 15980 1727204181.74818: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204181.74826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204181.74842: variable 'omit' from source: magic vars 15980 1727204181.75293: variable 'ansible_distribution_major_version' from source: facts 15980 1727204181.75305: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204181.75439: variable 'network_provider' from source: set_fact 15980 1727204181.75445: Evaluated conditional (network_provider == "initscripts"): False 15980 1727204181.75453: when evaluation is False, skipping this task 15980 1727204181.75457: _execute() done 15980 1727204181.75667: dumping result to json 15980 1727204181.75673: done dumping result, returning 15980 1727204181.75677: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-5f1d-4b72-000000000069] 15980 1727204181.75680: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000069 15980 1727204181.75754: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000069 15980 1727204181.75757: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15980 1727204181.75801: no more pending results, returning what we have 15980 1727204181.75804: results queue empty 15980 1727204181.75805: checking for 
any_errors_fatal 15980 1727204181.75810: done checking for any_errors_fatal 15980 1727204181.75811: checking for max_fail_percentage 15980 1727204181.75813: done checking for max_fail_percentage 15980 1727204181.75813: checking to see if all hosts have failed and the running result is not ok 15980 1727204181.75814: done checking to see if all hosts have failed 15980 1727204181.75815: getting the remaining hosts for this loop 15980 1727204181.75816: done getting the remaining hosts for this loop 15980 1727204181.75820: getting the next task for host managed-node2 15980 1727204181.75825: done getting next task for host managed-node2 15980 1727204181.75829: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15980 1727204181.75832: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204181.75845: getting variables 15980 1727204181.75847: in VariableManager get_vars() 15980 1727204181.75886: Calling all_inventory to load vars for managed-node2 15980 1727204181.75890: Calling groups_inventory to load vars for managed-node2 15980 1727204181.75892: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204181.75903: Calling all_plugins_play to load vars for managed-node2 15980 1727204181.75905: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204181.75908: Calling groups_plugins_play to load vars for managed-node2 15980 1727204181.77635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204181.79797: done with get_vars() 15980 1727204181.79832: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:56:21 -0400 (0:00:00.062) 0:00:43.209 ***** 15980 1727204181.79936: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 15980 1727204181.80595: worker is 1 (out of 1 available) 15980 1727204181.80607: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 15980 1727204181.80619: done queuing things up, now waiting for results queue to drain 15980 1727204181.80621: waiting for pending results... 
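Each `variable '...' from source: ...` line above records which layer the VariableManager resolved a value from (`role '' defaults`, `host vars`, `play vars`, `set_fact`, `magic vars`, and so on). The layering can be mimicked with an ordered lookup. A minimal sketch under a heavily simplified assumption: real Ansible defines 22 precedence levels, and the short `PRECEDENCE` list and sample values below are illustrative only, not the actual table or the actual play data.

```python
# Simplified variable-source layering, lowest precedence first.
# Assumption: only the handful of sources visible in the log, in their
# relative order; NOT Ansible's full 22-level precedence table.
PRECEDENCE = ["role defaults", "host vars", "play vars", "set_fact", "extra vars"]

def resolve(name, layers):
    """Return (value, source) for `name`, taking the highest-precedence
    layer that defines it. `layers` maps source name -> dict of variables."""
    winner = None
    for source in PRECEDENCE:  # walk lowest -> highest, later hits win
        if name in layers.get(source, {}):
            winner = (layers[source][name], source)
    if winner is None:
        raise KeyError(name)
    return winner

# Hypothetical sample values, loosely modeled on the run above.
layers = {
    "role defaults": {"network_provider": "initscripts"},
    "play vars": {"profile": "LSR-TST-br31"},
    "set_fact": {"network_provider": "nm", "interface": "LSR-TST-br31"},
}
print(resolve("network_provider", layers))  # set_fact shadows role defaults
```

This mirrors why the log reports `network_provider` "from source: set_fact" even though the role also ships a default for it.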
15980 1727204181.80867: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15980 1727204181.80892: in run() - task 127b8e07-fff9-5f1d-4b72-00000000006a 15980 1727204181.80918: variable 'ansible_search_path' from source: unknown 15980 1727204181.80926: variable 'ansible_search_path' from source: unknown 15980 1727204181.80982: calling self._execute() 15980 1727204181.81113: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204181.81125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204181.81140: variable 'omit' from source: magic vars 15980 1727204181.81612: variable 'ansible_distribution_major_version' from source: facts 15980 1727204181.81638: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204181.81651: variable 'omit' from source: magic vars 15980 1727204181.81720: variable 'omit' from source: magic vars 15980 1727204181.81916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15980 1727204181.84448: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15980 1727204181.84986: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15980 1727204181.85010: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15980 1727204181.85052: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15980 1727204181.85101: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15980 1727204181.85182: variable 'network_provider' from source: set_fact 15980 1727204181.85431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15980 1727204181.85436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15980 1727204181.85439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15980 1727204181.85470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15980 1727204181.85539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15980 1727204181.85580: variable 'omit' from source: magic vars 15980 1727204181.85721: variable 'omit' from source: magic vars 15980 1727204181.85847: variable 'network_connections' from source: play vars 15980 1727204181.85873: variable 'profile' from source: play vars 15980 1727204181.85944: variable 'profile' from source: play vars 15980 1727204181.85955: variable 'interface' from source: set_fact 15980 1727204181.86032: variable 'interface' from source: set_fact 15980 1727204181.86198: variable 'omit' from source: magic vars 15980 1727204181.86272: variable '__lsr_ansible_managed' from source: task vars 15980 1727204181.86280: variable '__lsr_ansible_managed' from source: task vars 15980 1727204181.86497: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15980 1727204181.86763: Loaded config def from plugin (lookup/template) 15980 1727204181.86778: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15980 1727204181.86823: File lookup term: get_ansible_managed.j2 15980 1727204181.86826: variable 'ansible_search_path' from source: unknown 15980 1727204181.86872: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15980 1727204181.86876: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15980 1727204181.86886: variable 'ansible_search_path' from source: unknown 15980 1727204181.94815: variable 'ansible_managed' from source: unknown 15980 1727204181.95131: variable 'omit' from source: magic vars 15980 1727204181.95135: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204181.95138: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204181.95140: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204181.95147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204181.95161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204181.95195: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204181.95203: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204181.95211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204181.95315: Set connection var ansible_connection to ssh 15980 1727204181.95327: Set connection var ansible_pipelining to False 15980 1727204181.95342: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204181.95356: Set connection var ansible_timeout to 10 15980 1727204181.95369: Set connection var ansible_shell_type to sh 15980 1727204181.95380: Set connection var ansible_shell_executable to /bin/sh 15980 1727204181.95411: variable 'ansible_shell_executable' from source: unknown 15980 1727204181.95418: variable 'ansible_connection' from source: unknown 15980 1727204181.95424: variable 'ansible_module_compression' from source: unknown 15980 1727204181.95431: variable 'ansible_shell_type' from source: unknown 15980 1727204181.95437: variable 'ansible_shell_executable' from source: unknown 15980 1727204181.95443: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204181.95455: variable 'ansible_pipelining' from source: unknown 15980 1727204181.95468: variable 'ansible_timeout' from source: unknown 15980 1727204181.95561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204181.95627: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204181.95653: variable 'omit' from source: magic vars 15980 1727204181.95670: starting attempt loop 15980 1727204181.95680: running the handler 15980 1727204181.95696: _low_level_execute_command(): starting 15980 1727204181.95705: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204181.96485: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204181.96568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204181.96620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204181.96657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204181.96673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204181.96792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204181.98573: stdout chunk (state=3): >>>/root <<< 15980 1727204181.98778: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204181.98782: stdout chunk (state=3): >>><<< 15980 1727204181.98785: stderr chunk (state=3): >>><<< 15980 1727204181.98809: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204181.98831: _low_level_execute_command(): starting 15980 1727204181.98926: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204181.9881728-19114-270537432389576 `" && echo ansible-tmp-1727204181.9881728-19114-270537432389576="` echo /root/.ansible/tmp/ansible-tmp-1727204181.9881728-19114-270537432389576 `" ) && sleep 0' 15980 1727204181.99582: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 15980 1727204181.99586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 15980 1727204181.99588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 15980 1727204181.99591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204181.99594: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204181.99659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204181.99663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204181.99692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204181.99803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204182.01806: stdout chunk (state=3): >>>ansible-tmp-1727204181.9881728-19114-270537432389576=/root/.ansible/tmp/ansible-tmp-1727204181.9881728-19114-270537432389576 <<< 15980 1727204182.02012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204182.02016: stderr chunk (state=3): >>><<< 15980 1727204182.02018: stdout chunk (state=3): >>><<< 15980 1727204182.02120: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204181.9881728-19114-270537432389576=/root/.ansible/tmp/ansible-tmp-1727204181.9881728-19114-270537432389576 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204182.02125: variable 'ansible_module_compression' from source: unknown 15980 1727204182.02140: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15980 1727204182.02176: variable 'ansible_facts' from source: unknown 15980 1727204182.02256: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204181.9881728-19114-270537432389576/AnsiballZ_network_connections.py 15980 1727204182.02517: Sending initial data 15980 1727204182.02521: Sent initial data (168 bytes) 15980 1727204182.03195: stderr chunk (state=3): >>>OpenSSH_9.6p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204182.03273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204182.03301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204182.03419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204182.05034: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204182.05114: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15980 1727204182.05197: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpuqsc96nz /root/.ansible/tmp/ansible-tmp-1727204181.9881728-19114-270537432389576/AnsiballZ_network_connections.py <<< 15980 1727204182.05201: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204181.9881728-19114-270537432389576/AnsiballZ_network_connections.py" <<< 15980 1727204182.05252: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpuqsc96nz" to remote "/root/.ansible/tmp/ansible-tmp-1727204181.9881728-19114-270537432389576/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204181.9881728-19114-270537432389576/AnsiballZ_network_connections.py" <<< 15980 1727204182.06697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204182.06804: stderr chunk (state=3): >>><<< 15980 1727204182.06809: stdout chunk (state=3): >>><<< 15980 1727204182.06812: done transferring module to remote 15980 1727204182.06814: _low_level_execute_command(): starting 15980 1727204182.06816: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204181.9881728-19114-270537432389576/ /root/.ansible/tmp/ansible-tmp-1727204181.9881728-19114-270537432389576/AnsiballZ_network_connections.py && sleep 0' 15980 1727204182.07478: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204182.07585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204182.07615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204182.07637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204182.07670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204182.07798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204182.09690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204182.09715: stdout chunk (state=3): >>><<< 15980 1727204182.09718: stderr chunk (state=3): >>><<< 15980 1727204182.09739: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204182.09831: _low_level_execute_command(): starting 15980 1727204182.09835: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204181.9881728-19114-270537432389576/AnsiballZ_network_connections.py && sleep 0' 15980 1727204182.10491: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204182.10537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204182.10541: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204182.10568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204182.10684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204182.39864: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_e_2fmdxg/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_e_2fmdxg/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/17a7d1a5-4da5-45e1-8ef4-6d7b416254ea: error=unknown <<< 15980 1727204182.40046: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15980 1727204182.41988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204182.42346: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. 
<<< 15980 1727204182.42350: stdout chunk (state=3): >>><<< 15980 1727204182.42353: stderr chunk (state=3): >>><<< 15980 1727204182.42356: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_e_2fmdxg/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_e_2fmdxg/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/17a7d1a5-4da5-45e1-8ef4-6d7b416254ea: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 15980 1727204182.42359: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204181.9881728-19114-270537432389576/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204182.42361: _low_level_execute_command(): starting 15980 1727204182.42364: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204181.9881728-19114-270537432389576/ > /dev/null 2>&1 && sleep 0' 
15980 1727204182.42980: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204182.42995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204182.43011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204182.43032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204182.43127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204182.43150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204182.43173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204182.43195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204182.43342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204182.45386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204182.45406: stdout chunk (state=3): >>><<< 15980 1727204182.45423: stderr chunk (state=3): >>><<< 15980 1727204182.45572: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204182.45576: handler run complete 15980 1727204182.45578: attempt loop complete, returning result 15980 1727204182.45581: _execute() done 15980 1727204182.45583: dumping result to json 15980 1727204182.45585: done dumping result, returning 15980 1727204182.45591: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-5f1d-4b72-00000000006a] 15980 1727204182.45594: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000006a 15980 1727204182.45683: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000006a 15980 1727204182.45686: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": 
false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 15980 1727204182.45800: no more pending results, returning what we have 15980 1727204182.45804: results queue empty 15980 1727204182.45805: checking for any_errors_fatal 15980 1727204182.45812: done checking for any_errors_fatal 15980 1727204182.45813: checking for max_fail_percentage 15980 1727204182.45815: done checking for max_fail_percentage 15980 1727204182.45816: checking to see if all hosts have failed and the running result is not ok 15980 1727204182.45817: done checking to see if all hosts have failed 15980 1727204182.45818: getting the remaining hosts for this loop 15980 1727204182.45820: done getting the remaining hosts for this loop 15980 1727204182.45824: getting the next task for host managed-node2 15980 1727204182.45834: done getting next task for host managed-node2 15980 1727204182.45839: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15980 1727204182.45841: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204182.45853: getting variables 15980 1727204182.45855: in VariableManager get_vars() 15980 1727204182.46018: Calling all_inventory to load vars for managed-node2 15980 1727204182.46022: Calling groups_inventory to load vars for managed-node2 15980 1727204182.46027: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204182.46040: Calling all_plugins_play to load vars for managed-node2 15980 1727204182.46043: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204182.46047: Calling groups_plugins_play to load vars for managed-node2 15980 1727204182.48481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204182.50612: done with get_vars() 15980 1727204182.50650: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:56:22 -0400 (0:00:00.708) 0:00:43.917 ***** 15980 1727204182.50743: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 15980 1727204182.51113: worker is 1 (out of 1 available) 15980 1727204182.51128: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 15980 1727204182.51140: done queuing things up, now waiting for results queue to drain 15980 1727204182.51142: waiting for pending results... 
15980 1727204182.51691: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 15980 1727204182.51698: in run() - task 127b8e07-fff9-5f1d-4b72-00000000006b 15980 1727204182.51703: variable 'ansible_search_path' from source: unknown 15980 1727204182.51707: variable 'ansible_search_path' from source: unknown 15980 1727204182.51717: calling self._execute() 15980 1727204182.51781: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204182.51785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204182.51805: variable 'omit' from source: magic vars 15980 1727204182.52274: variable 'ansible_distribution_major_version' from source: facts 15980 1727204182.52278: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204182.52385: variable 'network_state' from source: role '' defaults 15980 1727204182.52397: Evaluated conditional (network_state != {}): False 15980 1727204182.52401: when evaluation is False, skipping this task 15980 1727204182.52404: _execute() done 15980 1727204182.52408: dumping result to json 15980 1727204182.52410: done dumping result, returning 15980 1727204182.52419: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-5f1d-4b72-00000000006b] 15980 1727204182.52428: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000006b 15980 1727204182.52555: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000006b 15980 1727204182.52558: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15980 1727204182.52625: no more pending results, returning what we have 15980 1727204182.52630: results queue empty 15980 1727204182.52631: checking for any_errors_fatal 15980 1727204182.52645: done checking for any_errors_fatal 
15980 1727204182.52646: checking for max_fail_percentage 15980 1727204182.52649: done checking for max_fail_percentage 15980 1727204182.52650: checking to see if all hosts have failed and the running result is not ok 15980 1727204182.52651: done checking to see if all hosts have failed 15980 1727204182.52652: getting the remaining hosts for this loop 15980 1727204182.52653: done getting the remaining hosts for this loop 15980 1727204182.52658: getting the next task for host managed-node2 15980 1727204182.52667: done getting next task for host managed-node2 15980 1727204182.52672: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15980 1727204182.52675: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204182.52693: getting variables 15980 1727204182.52695: in VariableManager get_vars() 15980 1727204182.52742: Calling all_inventory to load vars for managed-node2 15980 1727204182.52745: Calling groups_inventory to load vars for managed-node2 15980 1727204182.52747: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204182.52763: Calling all_plugins_play to load vars for managed-node2 15980 1727204182.52769: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204182.52773: Calling groups_plugins_play to load vars for managed-node2 15980 1727204182.54684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204182.57003: done with get_vars() 15980 1727204182.57035: done getting variables 15980 1727204182.57106: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:56:22 -0400 (0:00:00.063) 0:00:43.981 ***** 15980 1727204182.57141: entering _queue_task() for managed-node2/debug 15980 1727204182.57522: worker is 1 (out of 1 available) 15980 1727204182.57536: exiting _queue_task() for managed-node2/debug 15980 1727204182.57549: done queuing things up, now waiting for results queue to drain 15980 1727204182.57551: waiting for pending results... 
15980 1727204182.57992: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15980 1727204182.58000: in run() - task 127b8e07-fff9-5f1d-4b72-00000000006c 15980 1727204182.58017: variable 'ansible_search_path' from source: unknown 15980 1727204182.58022: variable 'ansible_search_path' from source: unknown 15980 1727204182.58072: calling self._execute() 15980 1727204182.58171: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204182.58204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204182.58212: variable 'omit' from source: magic vars 15980 1727204182.58614: variable 'ansible_distribution_major_version' from source: facts 15980 1727204182.58630: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204182.58633: variable 'omit' from source: magic vars 15980 1727204182.58685: variable 'omit' from source: magic vars 15980 1727204182.58728: variable 'omit' from source: magic vars 15980 1727204182.58971: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204182.58976: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204182.58979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204182.58982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204182.58984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204182.58987: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204182.58990: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204182.58992: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 15980 1727204182.59036: Set connection var ansible_connection to ssh 15980 1727204182.59045: Set connection var ansible_pipelining to False 15980 1727204182.59052: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204182.59059: Set connection var ansible_timeout to 10 15980 1727204182.59065: Set connection var ansible_shell_type to sh 15980 1727204182.59073: Set connection var ansible_shell_executable to /bin/sh 15980 1727204182.59110: variable 'ansible_shell_executable' from source: unknown 15980 1727204182.59117: variable 'ansible_connection' from source: unknown 15980 1727204182.59121: variable 'ansible_module_compression' from source: unknown 15980 1727204182.59124: variable 'ansible_shell_type' from source: unknown 15980 1727204182.59129: variable 'ansible_shell_executable' from source: unknown 15980 1727204182.59132: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204182.59134: variable 'ansible_pipelining' from source: unknown 15980 1727204182.59137: variable 'ansible_timeout' from source: unknown 15980 1727204182.59139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204182.59335: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204182.59340: variable 'omit' from source: magic vars 15980 1727204182.59343: starting attempt loop 15980 1727204182.59345: running the handler 15980 1727204182.59483: variable '__network_connections_result' from source: set_fact 15980 1727204182.59560: handler run complete 15980 1727204182.59655: attempt loop complete, returning result 15980 1727204182.59659: _execute() done 15980 1727204182.59662: dumping result to json 15980 1727204182.59666: 
done dumping result, returning 15980 1727204182.59678: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-5f1d-4b72-00000000006c] 15980 1727204182.59681: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000006c 15980 1727204182.59791: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000006c 15980 1727204182.59796: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 15980 1727204182.59872: no more pending results, returning what we have 15980 1727204182.59876: results queue empty 15980 1727204182.59877: checking for any_errors_fatal 15980 1727204182.59884: done checking for any_errors_fatal 15980 1727204182.59885: checking for max_fail_percentage 15980 1727204182.59887: done checking for max_fail_percentage 15980 1727204182.59888: checking to see if all hosts have failed and the running result is not ok 15980 1727204182.59889: done checking to see if all hosts have failed 15980 1727204182.59890: getting the remaining hosts for this loop 15980 1727204182.59892: done getting the remaining hosts for this loop 15980 1727204182.59898: getting the next task for host managed-node2 15980 1727204182.59905: done getting next task for host managed-node2 15980 1727204182.59910: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15980 1727204182.59912: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204182.59923: getting variables 15980 1727204182.59925: in VariableManager get_vars() 15980 1727204182.60167: Calling all_inventory to load vars for managed-node2 15980 1727204182.60171: Calling groups_inventory to load vars for managed-node2 15980 1727204182.60174: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204182.60184: Calling all_plugins_play to load vars for managed-node2 15980 1727204182.60187: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204182.60191: Calling groups_plugins_play to load vars for managed-node2 15980 1727204182.61824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204182.65505: done with get_vars() 15980 1727204182.65554: done getting variables 15980 1727204182.65626: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:56:22 -0400 (0:00:00.085) 0:00:44.066 ***** 15980 1727204182.65660: entering _queue_task() for managed-node2/debug 15980 1727204182.66679: worker is 1 (out of 1 available) 15980 1727204182.66692: exiting _queue_task() for managed-node2/debug 15980 1727204182.66704: done queuing things up, now waiting for results queue to drain 15980 1727204182.66707: waiting for pending results... 
15980 1727204182.67230: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15980 1727204182.68078: in run() - task 127b8e07-fff9-5f1d-4b72-00000000006d 15980 1727204182.68083: variable 'ansible_search_path' from source: unknown 15980 1727204182.68086: variable 'ansible_search_path' from source: unknown 15980 1727204182.68089: calling self._execute() 15980 1727204182.68323: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204182.68601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204182.68606: variable 'omit' from source: magic vars 15980 1727204182.69521: variable 'ansible_distribution_major_version' from source: facts 15980 1727204182.69533: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204182.69541: variable 'omit' from source: magic vars 15980 1727204182.69608: variable 'omit' from source: magic vars 15980 1727204182.69652: variable 'omit' from source: magic vars 15980 1727204182.69697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204182.69741: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204182.69767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204182.69788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204182.69800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204182.69836: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204182.69840: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204182.69843: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 15980 1727204182.70035: Set connection var ansible_connection to ssh 15980 1727204182.70039: Set connection var ansible_pipelining to False 15980 1727204182.70042: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204182.70045: Set connection var ansible_timeout to 10 15980 1727204182.70047: Set connection var ansible_shell_type to sh 15980 1727204182.70049: Set connection var ansible_shell_executable to /bin/sh 15980 1727204182.70052: variable 'ansible_shell_executable' from source: unknown 15980 1727204182.70054: variable 'ansible_connection' from source: unknown 15980 1727204182.70057: variable 'ansible_module_compression' from source: unknown 15980 1727204182.70059: variable 'ansible_shell_type' from source: unknown 15980 1727204182.70061: variable 'ansible_shell_executable' from source: unknown 15980 1727204182.70063: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204182.70067: variable 'ansible_pipelining' from source: unknown 15980 1727204182.70069: variable 'ansible_timeout' from source: unknown 15980 1727204182.70071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204182.70210: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204182.70223: variable 'omit' from source: magic vars 15980 1727204182.70231: starting attempt loop 15980 1727204182.70234: running the handler 15980 1727204182.70360: variable '__network_connections_result' from source: set_fact 15980 1727204182.70390: variable '__network_connections_result' from source: set_fact 15980 1727204182.70508: handler run complete 15980 1727204182.70536: attempt loop complete, returning result 15980 1727204182.70539: 
_execute() done 15980 1727204182.70542: dumping result to json 15980 1727204182.70545: done dumping result, returning 15980 1727204182.70556: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-5f1d-4b72-00000000006d] 15980 1727204182.70560: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000006d 15980 1727204182.70798: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000006d 15980 1727204182.70802: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 15980 1727204182.70898: no more pending results, returning what we have 15980 1727204182.70901: results queue empty 15980 1727204182.70902: checking for any_errors_fatal 15980 1727204182.70910: done checking for any_errors_fatal 15980 1727204182.70911: checking for max_fail_percentage 15980 1727204182.70913: done checking for max_fail_percentage 15980 1727204182.70914: checking to see if all hosts have failed and the running result is not ok 15980 1727204182.70915: done checking to see if all hosts have failed 15980 1727204182.70916: getting the remaining hosts for this loop 15980 1727204182.70918: done getting the remaining hosts for this loop 15980 1727204182.70922: getting the next task for host managed-node2 15980 1727204182.70928: done getting next task for host managed-node2 15980 1727204182.70932: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15980 1727204182.70935: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204182.70945: getting variables 15980 1727204182.70947: in VariableManager get_vars() 15980 1727204182.71135: Calling all_inventory to load vars for managed-node2 15980 1727204182.71139: Calling groups_inventory to load vars for managed-node2 15980 1727204182.71141: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204182.71152: Calling all_plugins_play to load vars for managed-node2 15980 1727204182.71155: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204182.71158: Calling groups_plugins_play to load vars for managed-node2 15980 1727204182.73098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204182.75664: done with get_vars() 15980 1727204182.75712: done getting variables 15980 1727204182.75806: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:56:22 -0400 (0:00:00.101) 0:00:44.168 ***** 15980 1727204182.75844: entering _queue_task() for managed-node2/debug 15980 1727204182.76779: worker is 1 (out of 1 available) 15980 1727204182.76920: exiting _queue_task() for managed-node2/debug 15980 1727204182.76934: done queuing things up, now waiting for results queue to drain 15980 1727204182.76937: waiting for pending results... 
15980 1727204182.77141: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15980 1727204182.77260: in run() - task 127b8e07-fff9-5f1d-4b72-00000000006e 15980 1727204182.77279: variable 'ansible_search_path' from source: unknown 15980 1727204182.77285: variable 'ansible_search_path' from source: unknown 15980 1727204182.77351: calling self._execute() 15980 1727204182.77598: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204182.77602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204182.77606: variable 'omit' from source: magic vars 15980 1727204182.77975: variable 'ansible_distribution_major_version' from source: facts 15980 1727204182.77979: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204182.78150: variable 'network_state' from source: role '' defaults 15980 1727204182.78154: Evaluated conditional (network_state != {}): False 15980 1727204182.78157: when evaluation is False, skipping this task 15980 1727204182.78160: _execute() done 15980 1727204182.78163: dumping result to json 15980 1727204182.78473: done dumping result, returning 15980 1727204182.78571: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-5f1d-4b72-00000000006e] 15980 1727204182.78575: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000006e 15980 1727204182.78653: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000006e 15980 1727204182.78657: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 15980 1727204182.78700: no more pending results, returning what we have 15980 1727204182.78704: results queue empty 15980 1727204182.78705: checking for any_errors_fatal 15980 1727204182.78712: done checking for any_errors_fatal 15980 1727204182.78713: checking for 
max_fail_percentage 15980 1727204182.78714: done checking for max_fail_percentage 15980 1727204182.78715: checking to see if all hosts have failed and the running result is not ok 15980 1727204182.78716: done checking to see if all hosts have failed 15980 1727204182.78717: getting the remaining hosts for this loop 15980 1727204182.78718: done getting the remaining hosts for this loop 15980 1727204182.78722: getting the next task for host managed-node2 15980 1727204182.78729: done getting next task for host managed-node2 15980 1727204182.78734: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15980 1727204182.78736: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204182.78750: getting variables 15980 1727204182.78752: in VariableManager get_vars() 15980 1727204182.78792: Calling all_inventory to load vars for managed-node2 15980 1727204182.78795: Calling groups_inventory to load vars for managed-node2 15980 1727204182.78797: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204182.78809: Calling all_plugins_play to load vars for managed-node2 15980 1727204182.78812: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204182.78815: Calling groups_plugins_play to load vars for managed-node2 15980 1727204182.81341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204182.84048: done with get_vars() 15980 1727204182.84085: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:56:22 -0400 
(0:00:00.085) 0:00:44.254 ***** 15980 1727204182.84396: entering _queue_task() for managed-node2/ping 15980 1727204182.85214: worker is 1 (out of 1 available) 15980 1727204182.85232: exiting _queue_task() for managed-node2/ping 15980 1727204182.85246: done queuing things up, now waiting for results queue to drain 15980 1727204182.85248: waiting for pending results... 15980 1727204182.85816: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 15980 1727204182.85847: in run() - task 127b8e07-fff9-5f1d-4b72-00000000006f 15980 1727204182.85962: variable 'ansible_search_path' from source: unknown 15980 1727204182.85968: variable 'ansible_search_path' from source: unknown 15980 1727204182.86025: calling self._execute() 15980 1727204182.86475: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204182.86479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204182.86482: variable 'omit' from source: magic vars 15980 1727204182.87162: variable 'ansible_distribution_major_version' from source: facts 15980 1727204182.87283: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204182.87286: variable 'omit' from source: magic vars 15980 1727204182.87469: variable 'omit' from source: magic vars 15980 1727204182.87634: variable 'omit' from source: magic vars 15980 1727204182.87680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204182.87721: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204182.87745: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204182.87771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204182.88171: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204182.88176: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204182.88179: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204182.88183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204182.88186: Set connection var ansible_connection to ssh 15980 1727204182.88189: Set connection var ansible_pipelining to False 15980 1727204182.88191: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204182.88192: Set connection var ansible_timeout to 10 15980 1727204182.88195: Set connection var ansible_shell_type to sh 15980 1727204182.88197: Set connection var ansible_shell_executable to /bin/sh 15980 1727204182.88199: variable 'ansible_shell_executable' from source: unknown 15980 1727204182.88201: variable 'ansible_connection' from source: unknown 15980 1727204182.88204: variable 'ansible_module_compression' from source: unknown 15980 1727204182.88206: variable 'ansible_shell_type' from source: unknown 15980 1727204182.88208: variable 'ansible_shell_executable' from source: unknown 15980 1727204182.88210: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204182.88212: variable 'ansible_pipelining' from source: unknown 15980 1727204182.88215: variable 'ansible_timeout' from source: unknown 15980 1727204182.88218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204182.88572: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204182.88577: variable 'omit' from source: magic vars 15980 1727204182.88580: starting attempt loop 15980 1727204182.88582: running 
the handler 15980 1727204182.88585: _low_level_execute_command(): starting 15980 1727204182.88587: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204182.89544: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 15980 1727204182.89557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204182.89696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204182.89839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204182.91596: stdout chunk (state=3): >>>/root <<< 15980 1727204182.92079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204182.92084: stdout chunk (state=3): >>><<< 15980 1727204182.92087: stderr chunk (state=3): >>><<< 15980 1727204182.92092: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204182.92095: _low_level_execute_command(): starting 15980 1727204182.92097: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204182.9201913-19151-218366641209555 `" && echo ansible-tmp-1727204182.9201913-19151-218366641209555="` echo /root/.ansible/tmp/ansible-tmp-1727204182.9201913-19151-218366641209555 `" ) && sleep 0' 15980 1727204182.93178: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204182.93248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204182.93381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204182.95392: stdout chunk (state=3): >>>ansible-tmp-1727204182.9201913-19151-218366641209555=/root/.ansible/tmp/ansible-tmp-1727204182.9201913-19151-218366641209555 <<< 15980 1727204182.95525: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204182.95628: stderr chunk (state=3): >>><<< 15980 1727204182.95640: stdout chunk (state=3): >>><<< 15980 1727204182.95687: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204182.9201913-19151-218366641209555=/root/.ansible/tmp/ansible-tmp-1727204182.9201913-19151-218366641209555 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204182.95746: variable 'ansible_module_compression' from source: unknown 15980 1727204182.95928: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15980 1727204182.96136: variable 'ansible_facts' from source: unknown 15980 1727204182.96170: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204182.9201913-19151-218366641209555/AnsiballZ_ping.py 15980 1727204182.96604: Sending initial data 15980 1727204182.96618: Sent initial data (153 bytes) 15980 1727204182.97439: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204182.97458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204182.97478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204182.97585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204182.97611: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204182.97628: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204182.97806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204182.99407: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204182.99473: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204182.99533: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpsz8dt_qr /root/.ansible/tmp/ansible-tmp-1727204182.9201913-19151-218366641209555/AnsiballZ_ping.py <<< 15980 1727204182.99542: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204182.9201913-19151-218366641209555/AnsiballZ_ping.py" <<< 15980 1727204182.99602: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpsz8dt_qr" to remote "/root/.ansible/tmp/ansible-tmp-1727204182.9201913-19151-218366641209555/AnsiballZ_ping.py" <<< 15980 1727204182.99610: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204182.9201913-19151-218366641209555/AnsiballZ_ping.py" <<< 15980 1727204183.00772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204183.00781: stderr chunk (state=3): >>><<< 15980 1727204183.00784: stdout chunk (state=3): >>><<< 15980 1727204183.00814: done transferring module to remote 15980 1727204183.00829: _low_level_execute_command(): starting 15980 1727204183.00832: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204182.9201913-19151-218366641209555/ /root/.ansible/tmp/ansible-tmp-1727204182.9201913-19151-218366641209555/AnsiballZ_ping.py && sleep 0' 15980 1727204183.01487: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204183.01491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 15980 1727204183.01518: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204183.01521: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204183.01524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204183.01580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204183.01585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204183.01672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204183.03606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204183.03611: stdout chunk (state=3): >>><<< 15980 1727204183.03614: stderr chunk (state=3): >>><<< 15980 1727204183.03636: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204183.03733: _low_level_execute_command(): starting 15980 1727204183.03736: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204182.9201913-19151-218366641209555/AnsiballZ_ping.py && sleep 0' 15980 1727204183.04374: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204183.04378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204183.04381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204183.04383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 1727204183.04385: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204183.04387: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 15980 1727204183.04399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204183.04451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204183.04478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204183.04501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204183.04611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204183.20909: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15980 1727204183.22227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204183.22287: stderr chunk (state=3): >>><<< 15980 1727204183.22291: stdout chunk (state=3): >>><<< 15980 1727204183.22309: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 15980 1727204183.22333: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204182.9201913-19151-218366641209555/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204183.22344: _low_level_execute_command(): starting 15980 1727204183.22362: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204182.9201913-19151-218366641209555/ > /dev/null 2>&1 && sleep 0' 15980 1727204183.22845: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204183.22934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204183.22939: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 15980 1727204183.22942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204183.22958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204183.22996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204183.23106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204183.25180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204183.25185: stdout chunk (state=3): >>><<< 15980 1727204183.25188: stderr chunk (state=3): >>><<< 15980 1727204183.25389: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204183.25400: handler run complete 15980 1727204183.25402: attempt loop complete, returning result 15980 1727204183.25405: _execute() done 15980 1727204183.25407: dumping result to json 15980 1727204183.25409: done dumping result, returning 15980 1727204183.25411: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-5f1d-4b72-00000000006f] 15980 1727204183.25413: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000006f 15980 1727204183.25668: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000006f 15980 1727204183.25675: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 15980 1727204183.25745: no more pending results, returning what we have 15980 1727204183.25757: results queue empty 15980 1727204183.25759: checking for any_errors_fatal 15980 1727204183.25769: done checking for any_errors_fatal 15980 1727204183.25771: checking for max_fail_percentage 15980 1727204183.25774: done checking for max_fail_percentage 15980 1727204183.25778: checking to see if all hosts have failed and the running result is not ok 15980 1727204183.25779: done checking to see if all hosts have failed 15980 1727204183.25780: getting the remaining hosts for this loop 15980 1727204183.25782: done getting the remaining hosts for this loop 15980 1727204183.25791: getting the next task for host managed-node2 15980 1727204183.25803: done getting next task for host managed-node2 15980 1727204183.25805: ^ task is: TASK: meta (role_complete) 15980 1727204183.25807: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204183.25821: getting variables 15980 1727204183.25823: in VariableManager get_vars() 15980 1727204183.26016: Calling all_inventory to load vars for managed-node2 15980 1727204183.26019: Calling groups_inventory to load vars for managed-node2 15980 1727204183.26021: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204183.26035: Calling all_plugins_play to load vars for managed-node2 15980 1727204183.26037: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204183.26040: Calling groups_plugins_play to load vars for managed-node2 15980 1727204183.28346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204183.29843: done with get_vars() 15980 1727204183.29884: done getting variables 15980 1727204183.29997: done queuing things up, now waiting for results queue to drain 15980 1727204183.29999: results queue empty 15980 1727204183.30000: checking for any_errors_fatal 15980 1727204183.30003: done checking for any_errors_fatal 15980 1727204183.30004: checking for max_fail_percentage 15980 1727204183.30005: done checking for max_fail_percentage 15980 1727204183.30006: checking to see if all hosts have failed and the running result is not ok 15980 1727204183.30006: done checking to see if all hosts have failed 15980 1727204183.30008: getting the remaining hosts for this loop 15980 1727204183.30009: done getting the remaining hosts for this loop 15980 1727204183.30012: getting the next task for host managed-node2 15980 1727204183.30015: done getting next task for host managed-node2 15980 1727204183.30017: ^ task is: TASK: meta (flush_handlers) 15980 1727204183.30018: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204183.30021: getting variables 15980 1727204183.30022: in VariableManager get_vars() 15980 1727204183.30036: Calling all_inventory to load vars for managed-node2 15980 1727204183.30038: Calling groups_inventory to load vars for managed-node2 15980 1727204183.30040: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204183.30045: Calling all_plugins_play to load vars for managed-node2 15980 1727204183.30047: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204183.30050: Calling groups_plugins_play to load vars for managed-node2 15980 1727204183.31762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204183.33307: done with get_vars() 15980 1727204183.33346: done getting variables 15980 1727204183.33394: in VariableManager get_vars() 15980 1727204183.33405: Calling all_inventory to load vars for managed-node2 15980 1727204183.33406: Calling groups_inventory to load vars for managed-node2 15980 1727204183.33408: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204183.33412: Calling all_plugins_play to load vars for managed-node2 15980 1727204183.33413: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204183.33415: Calling groups_plugins_play to load vars for managed-node2 15980 1727204183.34586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204183.36210: done with get_vars() 15980 1727204183.36238: done queuing things up, now waiting for results queue to drain 15980 1727204183.36241: results queue empty 15980 1727204183.36242: checking for any_errors_fatal 15980 1727204183.36244: done checking for any_errors_fatal 15980 1727204183.36244: checking for max_fail_percentage 15980 1727204183.36246: done checking for 
max_fail_percentage 15980 1727204183.36246: checking to see if all hosts have failed and the running result is not ok 15980 1727204183.36247: done checking to see if all hosts have failed 15980 1727204183.36248: getting the remaining hosts for this loop 15980 1727204183.36249: done getting the remaining hosts for this loop 15980 1727204183.36253: getting the next task for host managed-node2 15980 1727204183.36256: done getting next task for host managed-node2 15980 1727204183.36258: ^ task is: TASK: meta (flush_handlers) 15980 1727204183.36259: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204183.36261: getting variables 15980 1727204183.36262: in VariableManager get_vars() 15980 1727204183.36278: Calling all_inventory to load vars for managed-node2 15980 1727204183.36280: Calling groups_inventory to load vars for managed-node2 15980 1727204183.36282: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204183.36290: Calling all_plugins_play to load vars for managed-node2 15980 1727204183.36292: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204183.36294: Calling groups_plugins_play to load vars for managed-node2 15980 1727204183.37313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204183.39185: done with get_vars() 15980 1727204183.39221: done getting variables 15980 1727204183.39277: in VariableManager get_vars() 15980 1727204183.39290: Calling all_inventory to load vars for managed-node2 15980 1727204183.39292: Calling groups_inventory to load vars for managed-node2 15980 1727204183.39294: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204183.39299: Calling 
all_plugins_play to load vars for managed-node2 15980 1727204183.39301: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204183.39304: Calling groups_plugins_play to load vars for managed-node2 15980 1727204183.40827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204183.42054: done with get_vars() 15980 1727204183.42086: done queuing things up, now waiting for results queue to drain 15980 1727204183.42088: results queue empty 15980 1727204183.42088: checking for any_errors_fatal 15980 1727204183.42089: done checking for any_errors_fatal 15980 1727204183.42090: checking for max_fail_percentage 15980 1727204183.42091: done checking for max_fail_percentage 15980 1727204183.42091: checking to see if all hosts have failed and the running result is not ok 15980 1727204183.42092: done checking to see if all hosts have failed 15980 1727204183.42092: getting the remaining hosts for this loop 15980 1727204183.42093: done getting the remaining hosts for this loop 15980 1727204183.42095: getting the next task for host managed-node2 15980 1727204183.42098: done getting next task for host managed-node2 15980 1727204183.42098: ^ task is: None 15980 1727204183.42099: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204183.42100: done queuing things up, now waiting for results queue to drain 15980 1727204183.42101: results queue empty 15980 1727204183.42101: checking for any_errors_fatal 15980 1727204183.42102: done checking for any_errors_fatal 15980 1727204183.42102: checking for max_fail_percentage 15980 1727204183.42103: done checking for max_fail_percentage 15980 1727204183.42103: checking to see if all hosts have failed and the running result is not ok 15980 1727204183.42104: done checking to see if all hosts have failed 15980 1727204183.42105: getting the next task for host managed-node2 15980 1727204183.42106: done getting next task for host managed-node2 15980 1727204183.42107: ^ task is: None 15980 1727204183.42108: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204183.42149: in VariableManager get_vars() 15980 1727204183.42164: done with get_vars() 15980 1727204183.42171: in VariableManager get_vars() 15980 1727204183.42177: done with get_vars() 15980 1727204183.42181: variable 'omit' from source: magic vars 15980 1727204183.42275: variable 'task' from source: play vars 15980 1727204183.42299: in VariableManager get_vars() 15980 1727204183.42307: done with get_vars() 15980 1727204183.42320: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_absent.yml] ************************ 15980 1727204183.42540: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15980 1727204183.42561: getting the remaining hosts for this loop 15980 1727204183.42563: done getting the remaining hosts for this loop 15980 1727204183.42564: getting the next task for host managed-node2 15980 1727204183.42568: done getting next task for host managed-node2 15980 1727204183.42570: ^ task is: TASK: Gathering Facts 15980 1727204183.42571: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204183.42572: getting variables 15980 1727204183.42573: in VariableManager get_vars() 15980 1727204183.42579: Calling all_inventory to load vars for managed-node2 15980 1727204183.42580: Calling groups_inventory to load vars for managed-node2 15980 1727204183.42582: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204183.42586: Calling all_plugins_play to load vars for managed-node2 15980 1727204183.42588: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204183.42592: Calling groups_plugins_play to load vars for managed-node2 15980 1727204183.43609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204183.45055: done with get_vars() 15980 1727204183.45075: done getting variables 15980 1727204183.45111: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Tuesday 24 September 2024 14:56:23 -0400 (0:00:00.607) 0:00:44.861 ***** 15980 1727204183.45132: entering _queue_task() for managed-node2/gather_facts 15980 1727204183.45433: worker is 1 (out of 1 available) 15980 1727204183.45447: exiting _queue_task() for managed-node2/gather_facts 15980 1727204183.45459: done queuing things up, now waiting for results queue to drain 15980 1727204183.45462: waiting for pending results... 
15980 1727204183.45740: running TaskExecutor() for managed-node2/TASK: Gathering Facts 15980 1727204183.45894: in run() - task 127b8e07-fff9-5f1d-4b72-00000000046e 15980 1727204183.45899: variable 'ansible_search_path' from source: unknown 15980 1727204183.45922: calling self._execute() 15980 1727204183.46056: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204183.46062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204183.46097: variable 'omit' from source: magic vars 15980 1727204183.46468: variable 'ansible_distribution_major_version' from source: facts 15980 1727204183.46478: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204183.46485: variable 'omit' from source: magic vars 15980 1727204183.46507: variable 'omit' from source: magic vars 15980 1727204183.46537: variable 'omit' from source: magic vars 15980 1727204183.46574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204183.46607: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204183.46623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204183.46644: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204183.46675: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204183.46686: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204183.46689: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204183.46694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204183.46769: Set connection var ansible_connection to ssh 15980 1727204183.46777: Set 
connection var ansible_pipelining to False 15980 1727204183.46784: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204183.46790: Set connection var ansible_timeout to 10 15980 1727204183.46795: Set connection var ansible_shell_type to sh 15980 1727204183.46801: Set connection var ansible_shell_executable to /bin/sh 15980 1727204183.46827: variable 'ansible_shell_executable' from source: unknown 15980 1727204183.46830: variable 'ansible_connection' from source: unknown 15980 1727204183.46833: variable 'ansible_module_compression' from source: unknown 15980 1727204183.46836: variable 'ansible_shell_type' from source: unknown 15980 1727204183.46839: variable 'ansible_shell_executable' from source: unknown 15980 1727204183.46842: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204183.46844: variable 'ansible_pipelining' from source: unknown 15980 1727204183.46846: variable 'ansible_timeout' from source: unknown 15980 1727204183.46849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204183.47039: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204183.47061: variable 'omit' from source: magic vars 15980 1727204183.47064: starting attempt loop 15980 1727204183.47071: running the handler 15980 1727204183.47102: variable 'ansible_facts' from source: unknown 15980 1727204183.47113: _low_level_execute_command(): starting 15980 1727204183.47121: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204183.47777: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 15980 1727204183.47784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204183.47818: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204183.47876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204183.47879: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204183.47882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204183.47968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204183.49727: stdout chunk (state=3): >>>/root <<< 15980 1727204183.49844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204183.49898: stderr chunk (state=3): >>><<< 15980 1727204183.49901: stdout chunk (state=3): >>><<< 15980 1727204183.49924: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204183.49941: _low_level_execute_command(): starting 15980 1727204183.49947: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204183.4992375-19246-17439894679652 `" && echo ansible-tmp-1727204183.4992375-19246-17439894679652="` echo /root/.ansible/tmp/ansible-tmp-1727204183.4992375-19246-17439894679652 `" ) && sleep 0' 15980 1727204183.50432: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204183.50436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 15980 1727204183.50439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15980 1727204183.50450: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204183.50452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204183.50495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204183.50499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204183.50578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204183.52553: stdout chunk (state=3): >>>ansible-tmp-1727204183.4992375-19246-17439894679652=/root/.ansible/tmp/ansible-tmp-1727204183.4992375-19246-17439894679652 <<< 15980 1727204183.52671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204183.52731: stderr chunk (state=3): >>><<< 15980 1727204183.52734: stdout chunk (state=3): >>><<< 15980 1727204183.52753: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204183.4992375-19246-17439894679652=/root/.ansible/tmp/ansible-tmp-1727204183.4992375-19246-17439894679652 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204183.52782: variable 'ansible_module_compression' from source: unknown 15980 1727204183.52826: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15980 1727204183.52887: variable 'ansible_facts' from source: unknown 15980 1727204183.53023: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204183.4992375-19246-17439894679652/AnsiballZ_setup.py 15980 1727204183.53144: Sending initial data 15980 1727204183.53147: Sent initial data (153 bytes) 15980 1727204183.53645: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204183.53649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204183.53653: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204183.53655: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204183.53710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204183.53714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204183.53718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204183.53787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204183.55381: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204183.55445: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204183.55512: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp76k5kz49 /root/.ansible/tmp/ansible-tmp-1727204183.4992375-19246-17439894679652/AnsiballZ_setup.py <<< 15980 1727204183.55520: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204183.4992375-19246-17439894679652/AnsiballZ_setup.py" <<< 15980 1727204183.55586: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp76k5kz49" to remote "/root/.ansible/tmp/ansible-tmp-1727204183.4992375-19246-17439894679652/AnsiballZ_setup.py" <<< 15980 1727204183.55590: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204183.4992375-19246-17439894679652/AnsiballZ_setup.py" <<< 15980 1727204183.56777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204183.56851: stderr chunk (state=3): >>><<< 15980 1727204183.56854: stdout chunk (state=3): >>><<< 15980 1727204183.56882: done transferring module to remote 15980 1727204183.56893: _low_level_execute_command(): starting 15980 1727204183.56896: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204183.4992375-19246-17439894679652/ /root/.ansible/tmp/ansible-tmp-1727204183.4992375-19246-17439894679652/AnsiballZ_setup.py && sleep 0' 15980 1727204183.57381: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204183.57385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 15980 1727204183.57387: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15980 1727204183.57389: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204183.57392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204183.57452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204183.57456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204183.57522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204183.59395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204183.59400: stderr chunk (state=3): >>><<< 15980 1727204183.59404: stdout chunk (state=3): >>><<< 15980 1727204183.59421: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204183.59428: _low_level_execute_command(): starting 15980 1727204183.59431: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204183.4992375-19246-17439894679652/AnsiballZ_setup.py && sleep 0' 15980 1727204183.59930: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204183.59934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204183.59936: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204183.59939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204183.59994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' <<< 15980 1727204183.59998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204183.60002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204183.60080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204184.26770: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", 
"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.7236328125, "5m": 0.53466796875, "15m": 0.26904296875}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "23", "epoch": "1727204183", "epoch_int": "1727204183", "date": "2024-09-24", "time": "14:56:23", "iso8601_micro": "2024-09-24T18:56:23.902734Z", "iso8601": "2024-09-24T18:56:23Z", "iso8601_basic": "20240924T145623902734", "iso8601_basic_short": "20240924T145623", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_local": {}, "ansible_lsb": {}, "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": 
"/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": 
"fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3049, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 667, "free": 3049}, "nocache": {"free": 3479, "used": 237}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": 
"ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 530, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325550592, "block_size": 4096, "block_total": 64479564, "block_available": 61358777, "block_used": 3120787, "inode_total": 16384000, "inode_available": 16301510, "inode_used": 82490, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", 
"gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15980 1727204184.28973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204184.28978: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. <<< 15980 1727204184.29009: stderr chunk (state=3): >>><<< 15980 1727204184.29013: stdout chunk (state=3): >>><<< 15980 1727204184.29048: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.7236328125, "5m": 0.53466796875, "15m": 0.26904296875}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "23", "epoch": "1727204183", "epoch_int": "1727204183", "date": "2024-09-24", "time": "14:56:23", "iso8601_micro": "2024-09-24T18:56:23.902734Z", "iso8601": "2024-09-24T18:56:23Z", "iso8601_basic": "20240924T145623902734", "iso8601_basic_short": "20240924T145623", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_local": {}, "ansible_lsb": {}, "ansible_fips": false, 
"ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": 
false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3049, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 667, "free": 3049}, "nocache": {"free": 3479, "used": 237}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", 
"ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 530, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325550592, "block_size": 4096, "block_total": 64479564, "block_available": 61358777, "block_used": 3120787, "inode_total": 16384000, 
"inode_available": 16301510, "inode_used": 82490, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
15980 1727204184.30361: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204183.4992375-19246-17439894679652/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204184.30367: _low_level_execute_command(): starting 15980 1727204184.30370: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204183.4992375-19246-17439894679652/ > /dev/null 2>&1 && sleep 0' 15980 1727204184.31777: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204184.31782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204184.31784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204184.31786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204184.32062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204184.32073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204184.32290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204184.34218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204184.34222: stderr chunk (state=3): >>><<< 15980 1727204184.34227: stdout chunk (state=3): >>><<< 15980 1727204184.34249: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204184.34267: 
handler run complete 15980 1727204184.34656: variable 'ansible_facts' from source: unknown 15980 1727204184.35085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204184.36123: variable 'ansible_facts' from source: unknown 15980 1727204184.36461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204184.37093: attempt loop complete, returning result 15980 1727204184.37097: _execute() done 15980 1727204184.37100: dumping result to json 15980 1727204184.37102: done dumping result, returning 15980 1727204184.37105: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-5f1d-4b72-00000000046e] 15980 1727204184.37107: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000046e 15980 1727204184.38273: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000046e 15980 1727204184.38278: WORKER PROCESS EXITING ok: [managed-node2] 15980 1727204184.38862: no more pending results, returning what we have 15980 1727204184.38868: results queue empty 15980 1727204184.38869: checking for any_errors_fatal 15980 1727204184.38871: done checking for any_errors_fatal 15980 1727204184.38872: checking for max_fail_percentage 15980 1727204184.38874: done checking for max_fail_percentage 15980 1727204184.38875: checking to see if all hosts have failed and the running result is not ok 15980 1727204184.38876: done checking to see if all hosts have failed 15980 1727204184.38877: getting the remaining hosts for this loop 15980 1727204184.38878: done getting the remaining hosts for this loop 15980 1727204184.38882: getting the next task for host managed-node2 15980 1727204184.38887: done getting next task for host managed-node2 15980 1727204184.38889: ^ task is: TASK: meta (flush_handlers) 15980 1727204184.38891: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204184.38895: getting variables 15980 1727204184.38897: in VariableManager get_vars() 15980 1727204184.38921: Calling all_inventory to load vars for managed-node2 15980 1727204184.38927: Calling groups_inventory to load vars for managed-node2 15980 1727204184.38931: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204184.38945: Calling all_plugins_play to load vars for managed-node2 15980 1727204184.38948: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204184.38951: Calling groups_plugins_play to load vars for managed-node2 15980 1727204184.44452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204184.49932: done with get_vars() 15980 1727204184.49976: done getting variables 15980 1727204184.50062: in VariableManager get_vars() 15980 1727204184.50481: Calling all_inventory to load vars for managed-node2 15980 1727204184.50484: Calling groups_inventory to load vars for managed-node2 15980 1727204184.50487: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204184.50495: Calling all_plugins_play to load vars for managed-node2 15980 1727204184.50497: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204184.50501: Calling groups_plugins_play to load vars for managed-node2 15980 1727204184.55058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204184.59678: done with get_vars() 15980 1727204184.59730: done queuing things up, now waiting for results queue to drain 15980 1727204184.59733: results queue empty 15980 1727204184.59733: checking for any_errors_fatal 15980 1727204184.59739: done checking for any_errors_fatal 15980 
1727204184.59739: checking for max_fail_percentage 15980 1727204184.59741: done checking for max_fail_percentage 15980 1727204184.59749: checking to see if all hosts have failed and the running result is not ok 15980 1727204184.59750: done checking to see if all hosts have failed 15980 1727204184.59751: getting the remaining hosts for this loop 15980 1727204184.59752: done getting the remaining hosts for this loop 15980 1727204184.59755: getting the next task for host managed-node2 15980 1727204184.59760: done getting next task for host managed-node2 15980 1727204184.59762: ^ task is: TASK: Include the task '{{ task }}' 15980 1727204184.59764: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204184.59768: getting variables 15980 1727204184.59769: in VariableManager get_vars() 15980 1727204184.59780: Calling all_inventory to load vars for managed-node2 15980 1727204184.59783: Calling groups_inventory to load vars for managed-node2 15980 1727204184.59785: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204184.59792: Calling all_plugins_play to load vars for managed-node2 15980 1727204184.59795: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204184.59798: Calling groups_plugins_play to load vars for managed-node2 15980 1727204184.63432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204184.81284: done with get_vars() 15980 1727204184.81323: done getting variables 15980 1727204184.81712: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_absent.yml'] ********************** task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Tuesday 24 September 2024 14:56:24 -0400 (0:00:01.366) 0:00:46.227 ***** 15980 1727204184.81744: entering _queue_task() for managed-node2/include_tasks 15980 1727204184.82838: worker is 1 (out of 1 available) 15980 1727204184.82855: exiting _queue_task() for managed-node2/include_tasks 15980 1727204184.83072: done queuing things up, now waiting for results queue to drain 15980 1727204184.83075: waiting for pending results... 15980 1727204184.83686: running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_profile_absent.yml' 15980 1727204184.84275: in run() - task 127b8e07-fff9-5f1d-4b72-000000000073 15980 1727204184.84282: variable 'ansible_search_path' from source: unknown 15980 1727204184.84285: calling self._execute() 15980 1727204184.84839: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204184.84844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204184.84848: variable 'omit' from source: magic vars 15980 1727204184.85948: variable 'ansible_distribution_major_version' from source: facts 15980 1727204184.85959: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204184.85970: variable 'task' from source: play vars 15980 1727204184.86049: variable 'task' from source: play vars 15980 1727204184.86058: _execute() done 15980 1727204184.86063: dumping result to json 15980 1727204184.86069: done dumping result, returning 15980 1727204184.86773: done running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_profile_absent.yml' [127b8e07-fff9-5f1d-4b72-000000000073] 15980 1727204184.86777: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000073 15980 1727204184.86884: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000073 15980 1727204184.86887: WORKER PROCESS EXITING 15980 1727204184.86919: no more 
pending results, returning what we have 15980 1727204184.86927: in VariableManager get_vars() 15980 1727204184.86968: Calling all_inventory to load vars for managed-node2 15980 1727204184.86972: Calling groups_inventory to load vars for managed-node2 15980 1727204184.86976: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204184.86993: Calling all_plugins_play to load vars for managed-node2 15980 1727204184.86996: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204184.87000: Calling groups_plugins_play to load vars for managed-node2 15980 1727204184.90560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204184.94846: done with get_vars() 15980 1727204184.95100: variable 'ansible_search_path' from source: unknown 15980 1727204184.95118: we have included files to process 15980 1727204184.95119: generating all_blocks data 15980 1727204184.95121: done generating all_blocks data 15980 1727204184.95121: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15980 1727204184.95123: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15980 1727204184.95128: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15980 1727204184.95524: in VariableManager get_vars() 15980 1727204184.95546: done with get_vars() 15980 1727204184.95879: done processing included file 15980 1727204184.95881: iterating over new_blocks loaded from include file 15980 1727204184.95883: in VariableManager get_vars() 15980 1727204184.95898: done with get_vars() 15980 1727204184.95900: filtering new block on tags 15980 1727204184.95920: done filtering new block on tags 15980 1727204184.95923: done iterating over 
new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node2 15980 1727204184.95931: extending task lists for all hosts with included blocks 15980 1727204184.95969: done extending task lists 15980 1727204184.95971: done processing included files 15980 1727204184.95971: results queue empty 15980 1727204184.95972: checking for any_errors_fatal 15980 1727204184.95974: done checking for any_errors_fatal 15980 1727204184.95975: checking for max_fail_percentage 15980 1727204184.95976: done checking for max_fail_percentage 15980 1727204184.95977: checking to see if all hosts have failed and the running result is not ok 15980 1727204184.95978: done checking to see if all hosts have failed 15980 1727204184.95978: getting the remaining hosts for this loop 15980 1727204184.95980: done getting the remaining hosts for this loop 15980 1727204184.95982: getting the next task for host managed-node2 15980 1727204184.95986: done getting next task for host managed-node2 15980 1727204184.95988: ^ task is: TASK: Include the task 'get_profile_stat.yml' 15980 1727204184.95990: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204184.95993: getting variables 15980 1727204184.95994: in VariableManager get_vars() 15980 1727204184.96004: Calling all_inventory to load vars for managed-node2 15980 1727204184.96006: Calling groups_inventory to load vars for managed-node2 15980 1727204184.96009: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204184.96015: Calling all_plugins_play to load vars for managed-node2 15980 1727204184.96017: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204184.96020: Calling groups_plugins_play to load vars for managed-node2 15980 1727204184.99559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204185.04059: done with get_vars() 15980 1727204185.04305: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.228) 0:00:46.456 ***** 15980 1727204185.04605: entering _queue_task() for managed-node2/include_tasks 15980 1727204185.05211: worker is 1 (out of 1 available) 15980 1727204185.05230: exiting _queue_task() for managed-node2/include_tasks 15980 1727204185.05244: done queuing things up, now waiting for results queue to drain 15980 1727204185.05246: waiting for pending results... 
15980 1727204185.05987: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 15980 1727204185.06413: in run() - task 127b8e07-fff9-5f1d-4b72-00000000047f 15980 1727204185.06429: variable 'ansible_search_path' from source: unknown 15980 1727204185.06433: variable 'ansible_search_path' from source: unknown 15980 1727204185.06588: calling self._execute() 15980 1727204185.07072: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204185.07076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204185.07080: variable 'omit' from source: magic vars 15980 1727204185.08376: variable 'ansible_distribution_major_version' from source: facts 15980 1727204185.08596: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204185.08601: _execute() done 15980 1727204185.08605: dumping result to json 15980 1727204185.08608: done dumping result, returning 15980 1727204185.08754: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-5f1d-4b72-00000000047f] 15980 1727204185.08759: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000047f 15980 1727204185.08970: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000047f 15980 1727204185.08977: WORKER PROCESS EXITING 15980 1727204185.09011: no more pending results, returning what we have 15980 1727204185.09017: in VariableManager get_vars() 15980 1727204185.09061: Calling all_inventory to load vars for managed-node2 15980 1727204185.09068: Calling groups_inventory to load vars for managed-node2 15980 1727204185.09072: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204185.09090: Calling all_plugins_play to load vars for managed-node2 15980 1727204185.09093: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204185.09096: Calling groups_plugins_play to load vars for managed-node2 15980 
1727204185.13117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204185.17778: done with get_vars() 15980 1727204185.17816: variable 'ansible_search_path' from source: unknown 15980 1727204185.17818: variable 'ansible_search_path' from source: unknown 15980 1727204185.17833: variable 'task' from source: play vars 15980 1727204185.17953: variable 'task' from source: play vars 15980 1727204185.18201: we have included files to process 15980 1727204185.18202: generating all_blocks data 15980 1727204185.18204: done generating all_blocks data 15980 1727204185.18205: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15980 1727204185.18207: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15980 1727204185.18209: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15980 1727204185.20209: done processing included file 15980 1727204185.20212: iterating over new_blocks loaded from include file 15980 1727204185.20214: in VariableManager get_vars() 15980 1727204185.20233: done with get_vars() 15980 1727204185.20235: filtering new block on tags 15980 1727204185.20262: done filtering new block on tags 15980 1727204185.20570: in VariableManager get_vars() 15980 1727204185.20588: done with get_vars() 15980 1727204185.20591: filtering new block on tags 15980 1727204185.20617: done filtering new block on tags 15980 1727204185.20619: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 15980 1727204185.20629: extending task lists for all hosts with included blocks 15980 1727204185.20750: done extending 
task lists 15980 1727204185.20752: done processing included files 15980 1727204185.20752: results queue empty 15980 1727204185.20753: checking for any_errors_fatal 15980 1727204185.20757: done checking for any_errors_fatal 15980 1727204185.20758: checking for max_fail_percentage 15980 1727204185.20759: done checking for max_fail_percentage 15980 1727204185.20760: checking to see if all hosts have failed and the running result is not ok 15980 1727204185.20761: done checking to see if all hosts have failed 15980 1727204185.20762: getting the remaining hosts for this loop 15980 1727204185.20763: done getting the remaining hosts for this loop 15980 1727204185.20969: getting the next task for host managed-node2 15980 1727204185.20975: done getting next task for host managed-node2 15980 1727204185.20978: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 15980 1727204185.20981: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204185.20984: getting variables 15980 1727204185.20985: in VariableManager get_vars() 15980 1727204185.20997: Calling all_inventory to load vars for managed-node2 15980 1727204185.21000: Calling groups_inventory to load vars for managed-node2 15980 1727204185.21002: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204185.21009: Calling all_plugins_play to load vars for managed-node2 15980 1727204185.21011: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204185.21014: Calling groups_plugins_play to load vars for managed-node2 15980 1727204185.24236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204185.27851: done with get_vars() 15980 1727204185.27898: done getting variables 15980 1727204185.27957: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.233) 0:00:46.690 ***** 15980 1727204185.27994: entering _queue_task() for managed-node2/set_fact 15980 1727204185.28506: worker is 1 (out of 1 available) 15980 1727204185.28519: exiting _queue_task() for managed-node2/set_fact 15980 1727204185.28534: done queuing things up, now waiting for results queue to drain 15980 1727204185.28536: waiting for pending results... 
15980 1727204185.28784: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 15980 1727204185.29030: in run() - task 127b8e07-fff9-5f1d-4b72-00000000048a 15980 1727204185.29034: variable 'ansible_search_path' from source: unknown 15980 1727204185.29037: variable 'ansible_search_path' from source: unknown 15980 1727204185.29040: calling self._execute() 15980 1727204185.29142: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204185.29158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204185.29177: variable 'omit' from source: magic vars 15980 1727204185.29647: variable 'ansible_distribution_major_version' from source: facts 15980 1727204185.29683: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204185.29686: variable 'omit' from source: magic vars 15980 1727204185.29772: variable 'omit' from source: magic vars 15980 1727204185.29797: variable 'omit' from source: magic vars 15980 1727204185.29851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204185.29906: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204185.30012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204185.30015: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204185.30017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204185.30022: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204185.30035: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204185.30044: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 15980 1727204185.30165: Set connection var ansible_connection to ssh 15980 1727204185.30245: Set connection var ansible_pipelining to False 15980 1727204185.30259: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204185.30372: Set connection var ansible_timeout to 10 15980 1727204185.30376: Set connection var ansible_shell_type to sh 15980 1727204185.30378: Set connection var ansible_shell_executable to /bin/sh 15980 1727204185.30400: variable 'ansible_shell_executable' from source: unknown 15980 1727204185.30408: variable 'ansible_connection' from source: unknown 15980 1727204185.30416: variable 'ansible_module_compression' from source: unknown 15980 1727204185.30424: variable 'ansible_shell_type' from source: unknown 15980 1727204185.30591: variable 'ansible_shell_executable' from source: unknown 15980 1727204185.30595: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204185.30597: variable 'ansible_pipelining' from source: unknown 15980 1727204185.30600: variable 'ansible_timeout' from source: unknown 15980 1727204185.30602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204185.30873: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204185.30901: variable 'omit' from source: magic vars 15980 1727204185.30912: starting attempt loop 15980 1727204185.30919: running the handler 15980 1727204185.30940: handler run complete 15980 1727204185.30989: attempt loop complete, returning result 15980 1727204185.31172: _execute() done 15980 1727204185.31175: dumping result to json 15980 1727204185.31178: done dumping result, returning 15980 1727204185.31181: done running TaskExecutor() for 
managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-5f1d-4b72-00000000048a] 15980 1727204185.31183: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000048a 15980 1727204185.31578: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000048a 15980 1727204185.31583: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 15980 1727204185.31649: no more pending results, returning what we have 15980 1727204185.31653: results queue empty 15980 1727204185.31654: checking for any_errors_fatal 15980 1727204185.31656: done checking for any_errors_fatal 15980 1727204185.31656: checking for max_fail_percentage 15980 1727204185.31658: done checking for max_fail_percentage 15980 1727204185.31659: checking to see if all hosts have failed and the running result is not ok 15980 1727204185.31660: done checking to see if all hosts have failed 15980 1727204185.31661: getting the remaining hosts for this loop 15980 1727204185.31663: done getting the remaining hosts for this loop 15980 1727204185.31670: getting the next task for host managed-node2 15980 1727204185.31678: done getting next task for host managed-node2 15980 1727204185.31682: ^ task is: TASK: Stat profile file 15980 1727204185.31686: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204185.31693: getting variables 15980 1727204185.31694: in VariableManager get_vars() 15980 1727204185.31731: Calling all_inventory to load vars for managed-node2 15980 1727204185.31734: Calling groups_inventory to load vars for managed-node2 15980 1727204185.31739: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204185.31753: Calling all_plugins_play to load vars for managed-node2 15980 1727204185.31756: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204185.31760: Calling groups_plugins_play to load vars for managed-node2 15980 1727204185.34024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204185.37315: done with get_vars() 15980 1727204185.37357: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.094) 0:00:46.784 ***** 15980 1727204185.37467: entering _queue_task() for managed-node2/stat 15980 1727204185.37862: worker is 1 (out of 1 available) 15980 1727204185.37982: exiting _queue_task() for managed-node2/stat 15980 1727204185.37994: done queuing things up, now waiting for results queue to drain 15980 1727204185.37997: waiting for pending results... 
15980 1727204185.38220: running TaskExecutor() for managed-node2/TASK: Stat profile file 15980 1727204185.38363: in run() - task 127b8e07-fff9-5f1d-4b72-00000000048b 15980 1727204185.38446: variable 'ansible_search_path' from source: unknown 15980 1727204185.38449: variable 'ansible_search_path' from source: unknown 15980 1727204185.38452: calling self._execute() 15980 1727204185.38536: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204185.38549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204185.38575: variable 'omit' from source: magic vars 15980 1727204185.39473: variable 'ansible_distribution_major_version' from source: facts 15980 1727204185.39478: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204185.39481: variable 'omit' from source: magic vars 15980 1727204185.39890: variable 'omit' from source: magic vars 15980 1727204185.40040: variable 'profile' from source: play vars 15980 1727204185.40333: variable 'interface' from source: set_fact 15980 1727204185.40771: variable 'interface' from source: set_fact 15980 1727204185.40775: variable 'omit' from source: magic vars 15980 1727204185.40778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204185.40781: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204185.40783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204185.40785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204185.40788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204185.41209: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 
1727204185.41221: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204185.41234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204185.41354: Set connection var ansible_connection to ssh 15980 1727204185.41974: Set connection var ansible_pipelining to False 15980 1727204185.41978: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204185.41981: Set connection var ansible_timeout to 10 15980 1727204185.41984: Set connection var ansible_shell_type to sh 15980 1727204185.41987: Set connection var ansible_shell_executable to /bin/sh 15980 1727204185.41990: variable 'ansible_shell_executable' from source: unknown 15980 1727204185.41992: variable 'ansible_connection' from source: unknown 15980 1727204185.41993: variable 'ansible_module_compression' from source: unknown 15980 1727204185.41995: variable 'ansible_shell_type' from source: unknown 15980 1727204185.41998: variable 'ansible_shell_executable' from source: unknown 15980 1727204185.42000: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204185.42002: variable 'ansible_pipelining' from source: unknown 15980 1727204185.42004: variable 'ansible_timeout' from source: unknown 15980 1727204185.42006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204185.42531: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204185.42810: variable 'omit' from source: magic vars 15980 1727204185.42827: starting attempt loop 15980 1727204185.42836: running the handler 15980 1727204185.42862: _low_level_execute_command(): starting 15980 1727204185.42908: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204185.44914: stderr chunk 
(state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204185.44920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204185.45014: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204185.45123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204185.46995: stdout chunk (state=3): >>>/root <<< 15980 1727204185.46999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204185.47206: stderr chunk (state=3): >>><<< 15980 1727204185.47211: stdout chunk (state=3): >>><<< 15980 1727204185.47241: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204185.47261: _low_level_execute_command(): starting 15980 1727204185.47530: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204185.4724815-19348-45258690450253 `" && echo ansible-tmp-1727204185.4724815-19348-45258690450253="` echo /root/.ansible/tmp/ansible-tmp-1727204185.4724815-19348-45258690450253 `" ) && sleep 0' 15980 1727204185.49190: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204185.49196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 15980 1727204185.49390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204185.49533: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204185.49591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 15980 1727204185.49595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204185.49650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204185.49662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204185.49881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204185.49975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204185.52064: stdout chunk (state=3): >>>ansible-tmp-1727204185.4724815-19348-45258690450253=/root/.ansible/tmp/ansible-tmp-1727204185.4724815-19348-45258690450253 <<< 15980 1727204185.52220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204185.52304: stderr chunk (state=3): >>><<< 15980 1727204185.52561: stdout chunk (state=3): >>><<< 15980 1727204185.52567: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204185.4724815-19348-45258690450253=/root/.ansible/tmp/ansible-tmp-1727204185.4724815-19348-45258690450253 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204185.52571: variable 'ansible_module_compression' from source: unknown 15980 1727204185.52632: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15980 1727204185.52891: variable 'ansible_facts' from source: unknown 15980 1727204185.52990: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204185.4724815-19348-45258690450253/AnsiballZ_stat.py 15980 1727204185.53433: Sending initial data 15980 1727204185.53437: Sent initial data (152 bytes) 15980 1727204185.54930: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204185.55177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204185.55194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204185.55210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204185.55386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204185.57162: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204185.57231: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204185.57573: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204185.4724815-19348-45258690450253/AnsiballZ_stat.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp2olw9wpa" to remote "/root/.ansible/tmp/ansible-tmp-1727204185.4724815-19348-45258690450253/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204185.4724815-19348-45258690450253/AnsiballZ_stat.py" <<< 15980 1727204185.57577: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp2olw9wpa /root/.ansible/tmp/ansible-tmp-1727204185.4724815-19348-45258690450253/AnsiballZ_stat.py <<< 15980 1727204185.59046: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204185.59063: stderr chunk (state=3): >>><<< 15980 1727204185.59076: stdout chunk (state=3): >>><<< 15980 1727204185.59111: done transferring module to remote 15980 1727204185.59133: _low_level_execute_command(): starting 15980 1727204185.59151: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204185.4724815-19348-45258690450253/ /root/.ansible/tmp/ansible-tmp-1727204185.4724815-19348-45258690450253/AnsiballZ_stat.py && sleep 0' 15980 1727204185.60567: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 15980 1727204185.60649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204185.60758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204185.60776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204185.60962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204185.60983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204185.62825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204185.63001: stderr chunk (state=3): >>><<< 15980 1727204185.63006: stdout chunk (state=3): >>><<< 15980 1727204185.63164: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204185.63172: _low_level_execute_command(): starting 15980 1727204185.63178: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204185.4724815-19348-45258690450253/AnsiballZ_stat.py && sleep 0' 15980 1727204185.64186: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204185.64244: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204185.64349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204185.64396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204185.64792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204185.81422: stdout chunk (state=3): >>> {"changed": false, "stat": 
{"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15980 1727204185.82790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204185.82795: stdout chunk (state=3): >>><<< 15980 1727204185.82797: stderr chunk (state=3): >>><<< 15980 1727204185.82817: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection 
to 10.31.47.73 closed. 15980 1727204185.82975: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204185.4724815-19348-45258690450253/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204185.82992: _low_level_execute_command(): starting 15980 1727204185.83272: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204185.4724815-19348-45258690450253/ > /dev/null 2>&1 && sleep 0' 15980 1727204185.85178: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204185.85316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204185.85588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204185.85702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204185.86032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204185.87867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204185.87881: stdout chunk (state=3): >>><<< 15980 1727204185.87900: stderr chunk (state=3): >>><<< 15980 1727204185.88173: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 
1727204185.88179: handler run complete 15980 1727204185.88182: attempt loop complete, returning result 15980 1727204185.88185: _execute() done 15980 1727204185.88187: dumping result to json 15980 1727204185.88189: done dumping result, returning 15980 1727204185.88328: done running TaskExecutor() for managed-node2/TASK: Stat profile file [127b8e07-fff9-5f1d-4b72-00000000048b] 15980 1727204185.88332: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000048b ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 15980 1727204185.88619: no more pending results, returning what we have 15980 1727204185.88622: results queue empty 15980 1727204185.88623: checking for any_errors_fatal 15980 1727204185.88639: done checking for any_errors_fatal 15980 1727204185.88640: checking for max_fail_percentage 15980 1727204185.88642: done checking for max_fail_percentage 15980 1727204185.88643: checking to see if all hosts have failed and the running result is not ok 15980 1727204185.88644: done checking to see if all hosts have failed 15980 1727204185.88646: getting the remaining hosts for this loop 15980 1727204185.88648: done getting the remaining hosts for this loop 15980 1727204185.88653: getting the next task for host managed-node2 15980 1727204185.88661: done getting next task for host managed-node2 15980 1727204185.88663: ^ task is: TASK: Set NM profile exist flag based on the profile files 15980 1727204185.88670: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204185.88676: getting variables 15980 1727204185.88678: in VariableManager get_vars() 15980 1727204185.88710: Calling all_inventory to load vars for managed-node2 15980 1727204185.88715: Calling groups_inventory to load vars for managed-node2 15980 1727204185.88719: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204185.89078: Calling all_plugins_play to load vars for managed-node2 15980 1727204185.89084: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204185.89095: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000048b 15980 1727204185.89098: WORKER PROCESS EXITING 15980 1727204185.89104: Calling groups_plugins_play to load vars for managed-node2 15980 1727204185.93402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204185.99468: done with get_vars() 15980 1727204185.99513: done getting variables 15980 1727204185.99587: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.621) 0:00:47.406 ***** 15980 1727204185.99624: entering _queue_task() for managed-node2/set_fact 15980 1727204186.00847: worker is 1 (out 
of 1 available) 15980 1727204186.00863: exiting _queue_task() for managed-node2/set_fact 15980 1727204186.01281: done queuing things up, now waiting for results queue to drain 15980 1727204186.01284: waiting for pending results... 15980 1727204186.01604: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 15980 1727204186.01872: in run() - task 127b8e07-fff9-5f1d-4b72-00000000048c 15980 1727204186.02103: variable 'ansible_search_path' from source: unknown 15980 1727204186.02108: variable 'ansible_search_path' from source: unknown 15980 1727204186.02112: calling self._execute() 15980 1727204186.02304: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204186.02429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204186.02433: variable 'omit' from source: magic vars 15980 1727204186.03373: variable 'ansible_distribution_major_version' from source: facts 15980 1727204186.03399: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204186.03642: variable 'profile_stat' from source: set_fact 15980 1727204186.03777: Evaluated conditional (profile_stat.stat.exists): False 15980 1727204186.03788: when evaluation is False, skipping this task 15980 1727204186.03830: _execute() done 15980 1727204186.03834: dumping result to json 15980 1727204186.03837: done dumping result, returning 15980 1727204186.03848: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-5f1d-4b72-00000000048c] 15980 1727204186.03882: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000048c 15980 1727204186.04306: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000048c skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15980 1727204186.04364: no more pending results, returning what we 
have 15980 1727204186.04370: results queue empty 15980 1727204186.04371: checking for any_errors_fatal 15980 1727204186.04383: done checking for any_errors_fatal 15980 1727204186.04384: checking for max_fail_percentage 15980 1727204186.04385: done checking for max_fail_percentage 15980 1727204186.04386: checking to see if all hosts have failed and the running result is not ok 15980 1727204186.04387: done checking to see if all hosts have failed 15980 1727204186.04388: getting the remaining hosts for this loop 15980 1727204186.04390: done getting the remaining hosts for this loop 15980 1727204186.04394: getting the next task for host managed-node2 15980 1727204186.04402: done getting next task for host managed-node2 15980 1727204186.04404: ^ task is: TASK: Get NM profile info 15980 1727204186.04410: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204186.04414: getting variables 15980 1727204186.04417: in VariableManager get_vars() 15980 1727204186.04453: Calling all_inventory to load vars for managed-node2 15980 1727204186.04457: Calling groups_inventory to load vars for managed-node2 15980 1727204186.04461: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204186.04472: WORKER PROCESS EXITING 15980 1727204186.04682: Calling all_plugins_play to load vars for managed-node2 15980 1727204186.04687: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204186.04692: Calling groups_plugins_play to load vars for managed-node2 15980 1727204186.08786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204186.13355: done with get_vars() 15980 1727204186.13400: done getting variables 15980 1727204186.13674: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.140) 0:00:47.547 ***** 15980 1727204186.13712: entering _queue_task() for managed-node2/shell 15980 1727204186.14575: worker is 1 (out of 1 available) 15980 1727204186.14591: exiting _queue_task() for managed-node2/shell 15980 1727204186.14605: done queuing things up, now waiting for results queue to drain 15980 1727204186.14608: waiting for pending results... 
15980 1727204186.15537: running TaskExecutor() for managed-node2/TASK: Get NM profile info 15980 1727204186.15919: in run() - task 127b8e07-fff9-5f1d-4b72-00000000048d 15980 1727204186.15925: variable 'ansible_search_path' from source: unknown 15980 1727204186.15929: variable 'ansible_search_path' from source: unknown 15980 1727204186.16025: calling self._execute() 15980 1727204186.16353: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204186.16357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204186.16360: variable 'omit' from source: magic vars 15980 1727204186.17235: variable 'ansible_distribution_major_version' from source: facts 15980 1727204186.17255: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204186.17268: variable 'omit' from source: magic vars 15980 1727204186.17323: variable 'omit' from source: magic vars 15980 1727204186.17451: variable 'profile' from source: play vars 15980 1727204186.17455: variable 'interface' from source: set_fact 15980 1727204186.17527: variable 'interface' from source: set_fact 15980 1727204186.17550: variable 'omit' from source: magic vars 15980 1727204186.17598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204186.17641: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204186.17666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204186.17685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204186.17703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204186.17738: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 
1727204186.17742: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204186.17744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204186.17857: Set connection var ansible_connection to ssh 15980 1727204186.17872: Set connection var ansible_pipelining to False 15980 1727204186.17875: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204186.17881: Set connection var ansible_timeout to 10 15980 1727204186.17887: Set connection var ansible_shell_type to sh 15980 1727204186.17893: Set connection var ansible_shell_executable to /bin/sh 15980 1727204186.17932: variable 'ansible_shell_executable' from source: unknown 15980 1727204186.17935: variable 'ansible_connection' from source: unknown 15980 1727204186.17938: variable 'ansible_module_compression' from source: unknown 15980 1727204186.17940: variable 'ansible_shell_type' from source: unknown 15980 1727204186.17943: variable 'ansible_shell_executable' from source: unknown 15980 1727204186.17945: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204186.17985: variable 'ansible_pipelining' from source: unknown 15980 1727204186.17989: variable 'ansible_timeout' from source: unknown 15980 1727204186.17992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204186.18118: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204186.18203: variable 'omit' from source: magic vars 15980 1727204186.18206: starting attempt loop 15980 1727204186.18210: running the handler 15980 1727204186.18213: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204186.18216: _low_level_execute_command(): starting 15980 1727204186.18218: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204186.18999: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204186.19010: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204186.19021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204186.19041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204186.19153: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204186.19171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204186.19215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204186.19220: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204186.19327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204186.21075: stdout chunk 
(state=3): >>>/root <<< 15980 1727204186.21396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204186.21401: stdout chunk (state=3): >>><<< 15980 1727204186.21403: stderr chunk (state=3): >>><<< 15980 1727204186.21505: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204186.21509: _low_level_execute_command(): starting 15980 1727204186.21512: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204186.2146387-19374-129454912903843 `" && echo ansible-tmp-1727204186.2146387-19374-129454912903843="` echo /root/.ansible/tmp/ansible-tmp-1727204186.2146387-19374-129454912903843 `" ) && sleep 0' 15980 1727204186.22701: stderr chunk (state=2): >>>OpenSSH_9.6p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204186.22888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204186.23029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204186.23136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204186.25122: stdout chunk (state=3): >>>ansible-tmp-1727204186.2146387-19374-129454912903843=/root/.ansible/tmp/ansible-tmp-1727204186.2146387-19374-129454912903843 <<< 15980 1727204186.25495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204186.25498: stdout chunk (state=3): >>><<< 15980 1727204186.25501: stderr chunk (state=3): >>><<< 15980 1727204186.25504: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204186.2146387-19374-129454912903843=/root/.ansible/tmp/ansible-tmp-1727204186.2146387-19374-129454912903843 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204186.25506: variable 'ansible_module_compression' from source: unknown 15980 1727204186.25508: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15980 1727204186.25510: variable 'ansible_facts' from source: unknown 15980 1727204186.25557: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204186.2146387-19374-129454912903843/AnsiballZ_command.py 15980 1727204186.25819: Sending initial data 15980 1727204186.25828: Sent initial data (156 bytes) 15980 1727204186.26998: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 15980 1727204186.27269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204186.27274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204186.27501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204186.27511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204186.27696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204186.29308: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15980 1727204186.29315: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 15980 1727204186.29322: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 15980 1727204186.29330: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 15980 1727204186.29337: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 15980 1727204186.29344: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 15980 1727204186.29350: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 15980 1727204186.29357: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 15980 
1727204186.29364: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15980 1727204186.29372: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 <<< 15980 1727204186.29392: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204186.29469: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15980 1727204186.29554: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpor7v0ou2 /root/.ansible/tmp/ansible-tmp-1727204186.2146387-19374-129454912903843/AnsiballZ_command.py <<< 15980 1727204186.29557: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204186.2146387-19374-129454912903843/AnsiballZ_command.py" <<< 15980 1727204186.29637: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpor7v0ou2" to remote "/root/.ansible/tmp/ansible-tmp-1727204186.2146387-19374-129454912903843/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204186.2146387-19374-129454912903843/AnsiballZ_command.py" <<< 15980 1727204186.30732: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204186.30736: stdout chunk (state=3): >>><<< 15980 1727204186.30739: stderr chunk (state=3): >>><<< 15980 1727204186.30741: done transferring module to remote 15980 1727204186.30743: _low_level_execute_command(): starting 15980 1727204186.30745: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204186.2146387-19374-129454912903843/ /root/.ansible/tmp/ansible-tmp-1727204186.2146387-19374-129454912903843/AnsiballZ_command.py && sleep 0' 15980 1727204186.31374: stderr chunk (state=2): >>>OpenSSH_9.6p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204186.31384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204186.31388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204186.31390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204186.31393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 1727204186.31395: stderr chunk (state=3): >>>debug2: match not found <<< 15980 1727204186.31397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204186.31400: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15980 1727204186.31402: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 15980 1727204186.31416: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15980 1727204186.31483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204186.31500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204186.31519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204186.31613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204186.33473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204186.33610: stderr chunk (state=3): >>><<< 15980 
1727204186.33614: stdout chunk (state=3): >>><<< 15980 1727204186.33638: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204186.33641: _low_level_execute_command(): starting 15980 1727204186.33644: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204186.2146387-19374-129454912903843/AnsiballZ_command.py && sleep 0' 15980 1727204186.34775: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204186.34780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204186.34783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204186.52996: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-24 14:56:26.510887", "end": "2024-09-24 14:56:26.528586", "delta": "0:00:00.017699", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15980 1727204186.54589: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 
<<< 15980 1727204186.54650: stderr chunk (state=3): >>><<< 15980 1727204186.54654: stdout chunk (state=3): >>><<< 15980 1727204186.54802: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-24 14:56:26.510887", "end": "2024-09-24 14:56:26.528586", "delta": "0:00:00.017699", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 
10.31.47.73 closed. 15980 1727204186.54806: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204186.2146387-19374-129454912903843/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204186.54815: _low_level_execute_command(): starting 15980 1727204186.54818: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204186.2146387-19374-129454912903843/ > /dev/null 2>&1 && sleep 0' 15980 1727204186.55569: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204186.55575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204186.55578: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204186.55581: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204186.55730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204186.55777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204186.55858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204186.57811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204186.57945: stderr chunk (state=3): >>><<< 15980 1727204186.57949: stdout chunk (state=3): >>><<< 15980 1727204186.57952: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204186.57957: handler run complete 15980 1727204186.57959: Evaluated conditional (False): False 15980 1727204186.57973: attempt loop complete, returning result 15980 1727204186.57976: _execute() done 15980 1727204186.57979: dumping result to json 15980 1727204186.58001: done dumping result, returning 15980 1727204186.58004: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [127b8e07-fff9-5f1d-4b72-00000000048d] 15980 1727204186.58006: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000048d 15980 1727204186.58137: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000048d 15980 1727204186.58142: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.017699", "end": "2024-09-24 14:56:26.528586", "rc": 1, "start": "2024-09-24 14:56:26.510887" } MSG: non-zero return code ...ignoring 15980 1727204186.58333: no more pending results, returning what we have 15980 1727204186.58336: results queue empty 15980 1727204186.58337: checking for any_errors_fatal 15980 1727204186.58350: done checking for any_errors_fatal 15980 1727204186.58351: checking for max_fail_percentage 15980 1727204186.58353: done checking for max_fail_percentage 15980 1727204186.58355: checking to see if all hosts have failed and the running result is not ok 15980 1727204186.58356: done checking to see if all hosts have failed 15980 1727204186.58357: getting the remaining hosts for this loop 15980 1727204186.58359: done getting the remaining hosts for this loop 15980 1727204186.58364: getting the next task for host managed-node2 15980 1727204186.58374: done getting next task for host managed-node2 15980 1727204186.58377: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15980 
1727204186.58381: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204186.58385: getting variables 15980 1727204186.58387: in VariableManager get_vars() 15980 1727204186.58419: Calling all_inventory to load vars for managed-node2 15980 1727204186.58422: Calling groups_inventory to load vars for managed-node2 15980 1727204186.58429: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204186.58441: Calling all_plugins_play to load vars for managed-node2 15980 1727204186.58444: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204186.58447: Calling groups_plugins_play to load vars for managed-node2 15980 1727204186.61079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204186.62993: done with get_vars() 15980 1727204186.63055: done getting variables 15980 1727204186.63152: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.494) 0:00:48.042 ***** 15980 1727204186.63211: entering _queue_task() for managed-node2/set_fact 15980 1727204186.63647: worker is 1 (out of 1 available) 15980 1727204186.63663: exiting _queue_task() for managed-node2/set_fact 15980 1727204186.63679: done queuing things up, now waiting for results queue to drain 15980 1727204186.63681: waiting for pending results... 15980 1727204186.63920: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15980 1727204186.64127: in run() - task 127b8e07-fff9-5f1d-4b72-00000000048e 15980 1727204186.64132: variable 'ansible_search_path' from source: unknown 15980 1727204186.64135: variable 'ansible_search_path' from source: unknown 15980 1727204186.64374: calling self._execute() 15980 1727204186.64378: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204186.64382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204186.64386: variable 'omit' from source: magic vars 15980 1727204186.65685: variable 'ansible_distribution_major_version' from source: facts 15980 1727204186.65747: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204186.65978: variable 'nm_profile_exists' from source: set_fact 15980 1727204186.65996: Evaluated conditional (nm_profile_exists.rc == 0): False 15980 1727204186.65999: when evaluation is False, skipping this task 15980 1727204186.66001: _execute() done 15980 1727204186.66004: dumping result to json 15980 1727204186.66006: done dumping result, returning 15980 1727204186.66030: done running TaskExecutor() for managed-node2/TASK: 
Set NM profile exist flag and ansible_managed flag true based on the nmcli output [127b8e07-fff9-5f1d-4b72-00000000048e] 15980 1727204186.66039: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000048e 15980 1727204186.66166: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000048e 15980 1727204186.66170: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 15980 1727204186.66295: no more pending results, returning what we have 15980 1727204186.66299: results queue empty 15980 1727204186.66300: checking for any_errors_fatal 15980 1727204186.66308: done checking for any_errors_fatal 15980 1727204186.66309: checking for max_fail_percentage 15980 1727204186.66310: done checking for max_fail_percentage 15980 1727204186.66311: checking to see if all hosts have failed and the running result is not ok 15980 1727204186.66312: done checking to see if all hosts have failed 15980 1727204186.66312: getting the remaining hosts for this loop 15980 1727204186.66314: done getting the remaining hosts for this loop 15980 1727204186.66319: getting the next task for host managed-node2 15980 1727204186.66328: done getting next task for host managed-node2 15980 1727204186.66330: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 15980 1727204186.66336: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204186.66344: getting variables 15980 1727204186.66345: in VariableManager get_vars() 15980 1727204186.66379: Calling all_inventory to load vars for managed-node2 15980 1727204186.66382: Calling groups_inventory to load vars for managed-node2 15980 1727204186.66385: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204186.66398: Calling all_plugins_play to load vars for managed-node2 15980 1727204186.66401: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204186.66404: Calling groups_plugins_play to load vars for managed-node2 15980 1727204186.68110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204186.69938: done with get_vars() 15980 1727204186.69988: done getting variables 15980 1727204186.70079: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15980 1727204186.70242: variable 'profile' from source: play vars 15980 1727204186.70248: variable 'interface' from source: set_fact 15980 1727204186.70327: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] ******************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.071) 0:00:48.113 ***** 15980 1727204186.70367: entering _queue_task() for managed-node2/command 15980 
1727204186.70829: worker is 1 (out of 1 available) 15980 1727204186.70847: exiting _queue_task() for managed-node2/command 15980 1727204186.70864: done queuing things up, now waiting for results queue to drain 15980 1727204186.71142: waiting for pending results... 15980 1727204186.71652: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 15980 1727204186.71838: in run() - task 127b8e07-fff9-5f1d-4b72-000000000490 15980 1727204186.71948: variable 'ansible_search_path' from source: unknown 15980 1727204186.72025: variable 'ansible_search_path' from source: unknown 15980 1727204186.72032: calling self._execute() 15980 1727204186.72036: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204186.72038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204186.72042: variable 'omit' from source: magic vars 15980 1727204186.72625: variable 'ansible_distribution_major_version' from source: facts 15980 1727204186.72730: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204186.72891: variable 'profile_stat' from source: set_fact 15980 1727204186.72943: Evaluated conditional (profile_stat.stat.exists): False 15980 1727204186.72955: when evaluation is False, skipping this task 15980 1727204186.72962: _execute() done 15980 1727204186.72980: dumping result to json 15980 1727204186.73021: done dumping result, returning 15980 1727204186.73085: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [127b8e07-fff9-5f1d-4b72-000000000490] 15980 1727204186.73130: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000490 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15980 1727204186.73480: no more pending results, returning what we have 15980 1727204186.73486: results queue empty 15980 
1727204186.73488: checking for any_errors_fatal 15980 1727204186.73498: done checking for any_errors_fatal 15980 1727204186.73500: checking for max_fail_percentage 15980 1727204186.73502: done checking for max_fail_percentage 15980 1727204186.73503: checking to see if all hosts have failed and the running result is not ok 15980 1727204186.73505: done checking to see if all hosts have failed 15980 1727204186.73505: getting the remaining hosts for this loop 15980 1727204186.73509: done getting the remaining hosts for this loop 15980 1727204186.73514: getting the next task for host managed-node2 15980 1727204186.73524: done getting next task for host managed-node2 15980 1727204186.73528: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 15980 1727204186.73533: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204186.73538: getting variables 15980 1727204186.73540: in VariableManager get_vars() 15980 1727204186.73808: Calling all_inventory to load vars for managed-node2 15980 1727204186.73811: Calling groups_inventory to load vars for managed-node2 15980 1727204186.73816: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204186.73834: Calling all_plugins_play to load vars for managed-node2 15980 1727204186.73838: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204186.73842: Calling groups_plugins_play to load vars for managed-node2 15980 1727204186.74453: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000490 15980 1727204186.74457: WORKER PROCESS EXITING 15980 1727204186.76044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204186.78855: done with get_vars() 15980 1727204186.78902: done getting variables 15980 1727204186.79060: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15980 1727204186.79294: variable 'profile' from source: play vars 15980 1727204186.79317: variable 'interface' from source: set_fact 15980 1727204186.79405: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.090) 0:00:48.204 ***** 15980 1727204186.79463: entering _queue_task() for managed-node2/set_fact 15980 1727204186.80076: worker is 1 (out of 1 available) 15980 1727204186.80091: exiting _queue_task() for managed-node2/set_fact 15980 
1727204186.80107: done queuing things up, now waiting for results queue to drain 15980 1727204186.80109: waiting for pending results... 15980 1727204186.80523: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 15980 1727204186.80777: in run() - task 127b8e07-fff9-5f1d-4b72-000000000491 15980 1727204186.80800: variable 'ansible_search_path' from source: unknown 15980 1727204186.80877: variable 'ansible_search_path' from source: unknown 15980 1727204186.81049: calling self._execute() 15980 1727204186.81202: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204186.81290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204186.81295: variable 'omit' from source: magic vars 15980 1727204186.82836: variable 'ansible_distribution_major_version' from source: facts 15980 1727204186.82950: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204186.83484: variable 'profile_stat' from source: set_fact 15980 1727204186.83508: Evaluated conditional (profile_stat.stat.exists): False 15980 1727204186.83512: when evaluation is False, skipping this task 15980 1727204186.83515: _execute() done 15980 1727204186.83517: dumping result to json 15980 1727204186.83519: done dumping result, returning 15980 1727204186.83521: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [127b8e07-fff9-5f1d-4b72-000000000491] 15980 1727204186.83524: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000491 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15980 1727204186.83696: no more pending results, returning what we have 15980 1727204186.83702: results queue empty 15980 1727204186.83703: checking for any_errors_fatal 15980 1727204186.83718: done checking for any_errors_fatal 15980 
1727204186.83719: checking for max_fail_percentage 15980 1727204186.83721: done checking for max_fail_percentage 15980 1727204186.83722: checking to see if all hosts have failed and the running result is not ok 15980 1727204186.83723: done checking to see if all hosts have failed 15980 1727204186.83724: getting the remaining hosts for this loop 15980 1727204186.83728: done getting the remaining hosts for this loop 15980 1727204186.83733: getting the next task for host managed-node2 15980 1727204186.83741: done getting next task for host managed-node2 15980 1727204186.83744: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 15980 1727204186.83749: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204186.83754: getting variables 15980 1727204186.83756: in VariableManager get_vars() 15980 1727204186.83899: Calling all_inventory to load vars for managed-node2 15980 1727204186.83902: Calling groups_inventory to load vars for managed-node2 15980 1727204186.83906: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204186.83922: Calling all_plugins_play to load vars for managed-node2 15980 1727204186.83926: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204186.83929: Calling groups_plugins_play to load vars for managed-node2 15980 1727204186.84493: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000491 15980 1727204186.84500: WORKER PROCESS EXITING 15980 1727204186.88752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204186.93210: done with get_vars() 15980 1727204186.93254: done getting variables 15980 1727204186.93486: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15980 1727204186.94002: variable 'profile' from source: play vars 15980 1727204186.94013: variable 'interface' from source: set_fact 15980 1727204186.94208: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] *********************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.147) 0:00:48.352 ***** 15980 1727204186.94283: entering _queue_task() for managed-node2/command 15980 1727204186.94997: worker is 1 (out of 1 available) 15980 1727204186.95014: exiting _queue_task() for managed-node2/command 15980 
1727204186.95039: done queuing things up, now waiting for results queue to drain 15980 1727204186.95042: waiting for pending results... 15980 1727204186.95303: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 15980 1727204186.95455: in run() - task 127b8e07-fff9-5f1d-4b72-000000000492 15980 1727204186.95481: variable 'ansible_search_path' from source: unknown 15980 1727204186.95491: variable 'ansible_search_path' from source: unknown 15980 1727204186.95541: calling self._execute() 15980 1727204186.95771: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204186.95775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204186.95779: variable 'omit' from source: magic vars 15980 1727204186.96094: variable 'ansible_distribution_major_version' from source: facts 15980 1727204186.96114: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204186.96250: variable 'profile_stat' from source: set_fact 15980 1727204186.96274: Evaluated conditional (profile_stat.stat.exists): False 15980 1727204186.96281: when evaluation is False, skipping this task 15980 1727204186.96287: _execute() done 15980 1727204186.96294: dumping result to json 15980 1727204186.96305: done dumping result, returning 15980 1727204186.96315: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [127b8e07-fff9-5f1d-4b72-000000000492] 15980 1727204186.96325: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000492 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15980 1727204186.96497: no more pending results, returning what we have 15980 1727204186.96502: results queue empty 15980 1727204186.96503: checking for any_errors_fatal 15980 1727204186.96632: done checking for any_errors_fatal 15980 1727204186.96634: checking 
for max_fail_percentage 15980 1727204186.96636: done checking for max_fail_percentage 15980 1727204186.96637: checking to see if all hosts have failed and the running result is not ok 15980 1727204186.96638: done checking to see if all hosts have failed 15980 1727204186.96639: getting the remaining hosts for this loop 15980 1727204186.96641: done getting the remaining hosts for this loop 15980 1727204186.96645: getting the next task for host managed-node2 15980 1727204186.96652: done getting next task for host managed-node2 15980 1727204186.96655: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 15980 1727204186.96660: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204186.96663: getting variables 15980 1727204186.96667: in VariableManager get_vars() 15980 1727204186.96697: Calling all_inventory to load vars for managed-node2 15980 1727204186.96700: Calling groups_inventory to load vars for managed-node2 15980 1727204186.96703: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204186.96711: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000492 15980 1727204186.96716: WORKER PROCESS EXITING 15980 1727204186.96729: Calling all_plugins_play to load vars for managed-node2 15980 1727204186.96733: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204186.96741: Calling groups_plugins_play to load vars for managed-node2 15980 1727204186.98685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204187.00945: done with get_vars() 15980 1727204187.00997: done getting variables 15980 1727204187.01071: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15980 1727204187.01219: variable 'profile' from source: play vars 15980 1727204187.01224: variable 'interface' from source: set_fact 15980 1727204187.01298: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.070) 0:00:48.423 ***** 15980 1727204187.01334: entering _queue_task() for managed-node2/set_fact 15980 1727204187.01740: worker is 1 (out of 1 available) 15980 1727204187.01756: exiting _queue_task() for managed-node2/set_fact 15980 
1727204187.01971: done queuing things up, now waiting for results queue to drain 15980 1727204187.01974: waiting for pending results... 15980 1727204187.02097: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 15980 1727204187.02245: in run() - task 127b8e07-fff9-5f1d-4b72-000000000493 15980 1727204187.02269: variable 'ansible_search_path' from source: unknown 15980 1727204187.02279: variable 'ansible_search_path' from source: unknown 15980 1727204187.02329: calling self._execute() 15980 1727204187.02441: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204187.02454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204187.02473: variable 'omit' from source: magic vars 15980 1727204187.02884: variable 'ansible_distribution_major_version' from source: facts 15980 1727204187.02902: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204187.03038: variable 'profile_stat' from source: set_fact 15980 1727204187.03060: Evaluated conditional (profile_stat.stat.exists): False 15980 1727204187.03071: when evaluation is False, skipping this task 15980 1727204187.03081: _execute() done 15980 1727204187.03088: dumping result to json 15980 1727204187.03097: done dumping result, returning 15980 1727204187.03107: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [127b8e07-fff9-5f1d-4b72-000000000493] 15980 1727204187.03117: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000493 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15980 1727204187.03363: no more pending results, returning what we have 15980 1727204187.03369: results queue empty 15980 1727204187.03371: checking for any_errors_fatal 15980 1727204187.03379: done checking for any_errors_fatal 15980 1727204187.03379: 
checking for max_fail_percentage 15980 1727204187.03381: done checking for max_fail_percentage 15980 1727204187.03382: checking to see if all hosts have failed and the running result is not ok 15980 1727204187.03383: done checking to see if all hosts have failed 15980 1727204187.03384: getting the remaining hosts for this loop 15980 1727204187.03386: done getting the remaining hosts for this loop 15980 1727204187.03390: getting the next task for host managed-node2 15980 1727204187.03400: done getting next task for host managed-node2 15980 1727204187.03402: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 15980 1727204187.03406: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204187.03411: getting variables 15980 1727204187.03413: in VariableManager get_vars() 15980 1727204187.03448: Calling all_inventory to load vars for managed-node2 15980 1727204187.03451: Calling groups_inventory to load vars for managed-node2 15980 1727204187.03455: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204187.03880: Calling all_plugins_play to load vars for managed-node2 15980 1727204187.03885: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204187.03890: Calling groups_plugins_play to load vars for managed-node2 15980 1727204187.04774: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000493 15980 1727204187.04779: WORKER PROCESS EXITING 15980 1727204187.05995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204187.07202: done with get_vars() 15980 1727204187.07237: done getting variables 15980 1727204187.07292: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15980 1727204187.07403: variable 'profile' from source: play vars 15980 1727204187.07408: variable 'interface' from source: set_fact 15980 1727204187.07476: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'LSR-TST-br31'] ********************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.061) 0:00:48.485 ***** 15980 1727204187.07507: entering _queue_task() for managed-node2/assert 15980 1727204187.08069: worker is 1 (out of 1 available) 15980 1727204187.08083: exiting _queue_task() for managed-node2/assert 15980 
1727204187.08097: done queuing things up, now waiting for results queue to drain 15980 1727204187.08100: waiting for pending results... 15980 1727204187.08234: running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'LSR-TST-br31' 15980 1727204187.08344: in run() - task 127b8e07-fff9-5f1d-4b72-000000000480 15980 1727204187.08356: variable 'ansible_search_path' from source: unknown 15980 1727204187.08361: variable 'ansible_search_path' from source: unknown 15980 1727204187.08400: calling self._execute() 15980 1727204187.08517: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204187.08522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204187.08527: variable 'omit' from source: magic vars 15980 1727204187.08958: variable 'ansible_distribution_major_version' from source: facts 15980 1727204187.08962: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204187.08978: variable 'omit' from source: magic vars 15980 1727204187.09066: variable 'omit' from source: magic vars 15980 1727204187.09182: variable 'profile' from source: play vars 15980 1727204187.09192: variable 'interface' from source: set_fact 15980 1727204187.09288: variable 'interface' from source: set_fact 15980 1727204187.09394: variable 'omit' from source: magic vars 15980 1727204187.09398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204187.09420: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204187.09450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204187.09479: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204187.09505: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204187.09555: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204187.09567: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204187.09575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204187.09698: Set connection var ansible_connection to ssh 15980 1727204187.09710: Set connection var ansible_pipelining to False 15980 1727204187.09736: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204187.09739: Set connection var ansible_timeout to 10 15980 1727204187.09750: Set connection var ansible_shell_type to sh 15980 1727204187.09754: Set connection var ansible_shell_executable to /bin/sh 15980 1727204187.09781: variable 'ansible_shell_executable' from source: unknown 15980 1727204187.09784: variable 'ansible_connection' from source: unknown 15980 1727204187.09787: variable 'ansible_module_compression' from source: unknown 15980 1727204187.09789: variable 'ansible_shell_type' from source: unknown 15980 1727204187.09792: variable 'ansible_shell_executable' from source: unknown 15980 1727204187.09794: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204187.09799: variable 'ansible_pipelining' from source: unknown 15980 1727204187.09802: variable 'ansible_timeout' from source: unknown 15980 1727204187.09804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204187.09932: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204187.09951: variable 'omit' from source: magic vars 15980 1727204187.09961: starting 
attempt loop 15980 1727204187.09964: running the handler 15980 1727204187.10074: variable 'lsr_net_profile_exists' from source: set_fact 15980 1727204187.10080: Evaluated conditional (not lsr_net_profile_exists): True 15980 1727204187.10086: handler run complete 15980 1727204187.10099: attempt loop complete, returning result 15980 1727204187.10102: _execute() done 15980 1727204187.10104: dumping result to json 15980 1727204187.10107: done dumping result, returning 15980 1727204187.10114: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'LSR-TST-br31' [127b8e07-fff9-5f1d-4b72-000000000480] 15980 1727204187.10119: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000480 15980 1727204187.10212: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000480 15980 1727204187.10215: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 15980 1727204187.10278: no more pending results, returning what we have 15980 1727204187.10281: results queue empty 15980 1727204187.10282: checking for any_errors_fatal 15980 1727204187.10291: done checking for any_errors_fatal 15980 1727204187.10291: checking for max_fail_percentage 15980 1727204187.10293: done checking for max_fail_percentage 15980 1727204187.10295: checking to see if all hosts have failed and the running result is not ok 15980 1727204187.10296: done checking to see if all hosts have failed 15980 1727204187.10297: getting the remaining hosts for this loop 15980 1727204187.10299: done getting the remaining hosts for this loop 15980 1727204187.10304: getting the next task for host managed-node2 15980 1727204187.10313: done getting next task for host managed-node2 15980 1727204187.10315: ^ task is: TASK: meta (flush_handlers) 15980 1727204187.10317: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204187.10323: getting variables 15980 1727204187.10332: in VariableManager get_vars() 15980 1727204187.10364: Calling all_inventory to load vars for managed-node2 15980 1727204187.10369: Calling groups_inventory to load vars for managed-node2 15980 1727204187.10372: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204187.10384: Calling all_plugins_play to load vars for managed-node2 15980 1727204187.10387: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204187.10389: Calling groups_plugins_play to load vars for managed-node2 15980 1727204187.11546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204187.13248: done with get_vars() 15980 1727204187.13286: done getting variables 15980 1727204187.13346: in VariableManager get_vars() 15980 1727204187.13355: Calling all_inventory to load vars for managed-node2 15980 1727204187.13357: Calling groups_inventory to load vars for managed-node2 15980 1727204187.13358: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204187.13362: Calling all_plugins_play to load vars for managed-node2 15980 1727204187.13364: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204187.13368: Calling groups_plugins_play to load vars for managed-node2 15980 1727204187.14322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204187.16047: done with get_vars() 15980 1727204187.16095: done queuing things up, now waiting for results queue to drain 15980 1727204187.16097: results queue empty 15980 1727204187.16098: checking for any_errors_fatal 15980 1727204187.16102: done checking for any_errors_fatal 15980 1727204187.16102: checking for max_fail_percentage 15980 1727204187.16103: done checking for 
max_fail_percentage 15980 1727204187.16104: checking to see if all hosts have failed and the running result is not ok 15980 1727204187.16112: done checking to see if all hosts have failed 15980 1727204187.16113: getting the remaining hosts for this loop 15980 1727204187.16114: done getting the remaining hosts for this loop 15980 1727204187.16117: getting the next task for host managed-node2 15980 1727204187.16121: done getting next task for host managed-node2 15980 1727204187.16123: ^ task is: TASK: meta (flush_handlers) 15980 1727204187.16124: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204187.16127: getting variables 15980 1727204187.16129: in VariableManager get_vars() 15980 1727204187.16140: Calling all_inventory to load vars for managed-node2 15980 1727204187.16142: Calling groups_inventory to load vars for managed-node2 15980 1727204187.16145: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204187.16150: Calling all_plugins_play to load vars for managed-node2 15980 1727204187.16152: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204187.16155: Calling groups_plugins_play to load vars for managed-node2 15980 1727204187.17667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204187.19792: done with get_vars() 15980 1727204187.19831: done getting variables 15980 1727204187.19895: in VariableManager get_vars() 15980 1727204187.19907: Calling all_inventory to load vars for managed-node2 15980 1727204187.19910: Calling groups_inventory to load vars for managed-node2 15980 1727204187.19912: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204187.19918: Calling 
all_plugins_play to load vars for managed-node2 15980 1727204187.19920: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204187.19923: Calling groups_plugins_play to load vars for managed-node2 15980 1727204187.27375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204187.29552: done with get_vars() 15980 1727204187.29600: done queuing things up, now waiting for results queue to drain 15980 1727204187.29602: results queue empty 15980 1727204187.29603: checking for any_errors_fatal 15980 1727204187.29604: done checking for any_errors_fatal 15980 1727204187.29605: checking for max_fail_percentage 15980 1727204187.29607: done checking for max_fail_percentage 15980 1727204187.29608: checking to see if all hosts have failed and the running result is not ok 15980 1727204187.29609: done checking to see if all hosts have failed 15980 1727204187.29610: getting the remaining hosts for this loop 15980 1727204187.29611: done getting the remaining hosts for this loop 15980 1727204187.29614: getting the next task for host managed-node2 15980 1727204187.29618: done getting next task for host managed-node2 15980 1727204187.29619: ^ task is: None 15980 1727204187.29620: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204187.29622: done queuing things up, now waiting for results queue to drain 15980 1727204187.29622: results queue empty 15980 1727204187.29623: checking for any_errors_fatal 15980 1727204187.29624: done checking for any_errors_fatal 15980 1727204187.29627: checking for max_fail_percentage 15980 1727204187.29628: done checking for max_fail_percentage 15980 1727204187.29629: checking to see if all hosts have failed and the running result is not ok 15980 1727204187.29630: done checking to see if all hosts have failed 15980 1727204187.29631: getting the next task for host managed-node2 15980 1727204187.29633: done getting next task for host managed-node2 15980 1727204187.29634: ^ task is: None 15980 1727204187.29635: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204187.29835: in VariableManager get_vars() 15980 1727204187.29854: done with get_vars() 15980 1727204187.29861: in VariableManager get_vars() 15980 1727204187.29873: done with get_vars() 15980 1727204187.29877: variable 'omit' from source: magic vars 15980 1727204187.29981: variable 'task' from source: play vars 15980 1727204187.30012: in VariableManager get_vars() 15980 1727204187.30026: done with get_vars() 15980 1727204187.30046: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_absent.yml] ************************* 15980 1727204187.30839: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15980 1727204187.30893: getting the remaining hosts for this loop 15980 1727204187.30895: done getting the remaining hosts for this loop 15980 1727204187.30898: getting the next task for host managed-node2 15980 1727204187.30901: done getting next task for host managed-node2 15980 1727204187.30903: ^ task is: TASK: Gathering Facts 15980 1727204187.30905: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204187.30907: getting variables 15980 1727204187.30908: in VariableManager get_vars() 15980 1727204187.30919: Calling all_inventory to load vars for managed-node2 15980 1727204187.30922: Calling groups_inventory to load vars for managed-node2 15980 1727204187.30928: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204187.30934: Calling all_plugins_play to load vars for managed-node2 15980 1727204187.30937: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204187.30940: Calling groups_plugins_play to load vars for managed-node2 15980 1727204187.32979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204187.35664: done with get_vars() 15980 1727204187.35702: done getting variables 15980 1727204187.35753: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.284) 0:00:48.770 ***** 15980 1727204187.35986: entering _queue_task() for managed-node2/gather_facts 15980 1727204187.36769: worker is 1 (out of 1 available) 15980 1727204187.36782: exiting _queue_task() for managed-node2/gather_facts 15980 1727204187.36794: done queuing things up, now waiting for results queue to drain 15980 1727204187.36796: waiting for pending results... 
15980 1727204187.37377: running TaskExecutor() for managed-node2/TASK: Gathering Facts 15980 1727204187.37672: in run() - task 127b8e07-fff9-5f1d-4b72-0000000004c5 15980 1727204187.37677: variable 'ansible_search_path' from source: unknown 15980 1727204187.37680: calling self._execute() 15980 1727204187.37683: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204187.37697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204187.37712: variable 'omit' from source: magic vars 15980 1727204187.38168: variable 'ansible_distribution_major_version' from source: facts 15980 1727204187.38188: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204187.38199: variable 'omit' from source: magic vars 15980 1727204187.38241: variable 'omit' from source: magic vars 15980 1727204187.38287: variable 'omit' from source: magic vars 15980 1727204187.38343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204187.38391: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204187.38422: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204187.38454: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204187.38473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204187.38508: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204187.38517: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204187.38528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204187.38654: Set connection var ansible_connection to ssh 15980 1727204187.38783: Set 
connection var ansible_pipelining to False 15980 1727204187.38787: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204187.38789: Set connection var ansible_timeout to 10 15980 1727204187.38792: Set connection var ansible_shell_type to sh 15980 1727204187.38794: Set connection var ansible_shell_executable to /bin/sh 15980 1727204187.38797: variable 'ansible_shell_executable' from source: unknown 15980 1727204187.38800: variable 'ansible_connection' from source: unknown 15980 1727204187.38802: variable 'ansible_module_compression' from source: unknown 15980 1727204187.38805: variable 'ansible_shell_type' from source: unknown 15980 1727204187.38808: variable 'ansible_shell_executable' from source: unknown 15980 1727204187.38811: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204187.38813: variable 'ansible_pipelining' from source: unknown 15980 1727204187.38816: variable 'ansible_timeout' from source: unknown 15980 1727204187.38819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204187.38991: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204187.39015: variable 'omit' from source: magic vars 15980 1727204187.39071: starting attempt loop 15980 1727204187.39075: running the handler 15980 1727204187.39077: variable 'ansible_facts' from source: unknown 15980 1727204187.39081: _low_level_execute_command(): starting 15980 1727204187.39093: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204187.40199: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204187.40215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15980 1727204187.40287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204187.40342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204187.40357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204187.40394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204187.40635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204187.43041: stdout chunk (state=3): >>>/root <<< 15980 1727204187.43045: stdout chunk (state=3): >>><<< 15980 1727204187.43048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204187.43050: stderr chunk (state=3): >>><<< 15980 1727204187.43054: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204187.43056: _low_level_execute_command(): starting 15980 1727204187.43059: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204187.4293592-19424-134026617967472 `" && echo ansible-tmp-1727204187.4293592-19424-134026617967472="` echo /root/.ansible/tmp/ansible-tmp-1727204187.4293592-19424-134026617967472 `" ) && sleep 0' 15980 1727204187.44647: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204187.44686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204187.44701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204187.44982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204187.45010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204187.45169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204187.47191: stdout chunk (state=3): >>>ansible-tmp-1727204187.4293592-19424-134026617967472=/root/.ansible/tmp/ansible-tmp-1727204187.4293592-19424-134026617967472 <<< 15980 1727204187.47361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204187.47375: stdout chunk (state=3): >>><<< 15980 1727204187.47575: stderr chunk (state=3): >>><<< 15980 1727204187.47580: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204187.4293592-19424-134026617967472=/root/.ansible/tmp/ansible-tmp-1727204187.4293592-19424-134026617967472 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204187.47583: variable 'ansible_module_compression' from source: unknown 15980 1727204187.47642: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15980 1727204187.47927: variable 'ansible_facts' from source: unknown 15980 1727204187.48254: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204187.4293592-19424-134026617967472/AnsiballZ_setup.py 15980 1727204187.48745: Sending initial data 15980 1727204187.48837: Sent initial data (154 bytes) 15980 1727204187.50180: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204187.50235: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204187.50253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204187.50355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204187.52033: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204187.52093: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204187.52173: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpuo2gsdmq /root/.ansible/tmp/ansible-tmp-1727204187.4293592-19424-134026617967472/AnsiballZ_setup.py <<< 15980 1727204187.52177: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204187.4293592-19424-134026617967472/AnsiballZ_setup.py" <<< 15980 1727204187.52341: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpuo2gsdmq" to remote "/root/.ansible/tmp/ansible-tmp-1727204187.4293592-19424-134026617967472/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204187.4293592-19424-134026617967472/AnsiballZ_setup.py" <<< 15980 1727204187.55210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204187.55475: stderr chunk (state=3): >>><<< 15980 1727204187.55479: stdout chunk (state=3): >>><<< 15980 1727204187.55673: done transferring module to remote 15980 1727204187.55677: _low_level_execute_command(): starting 15980 1727204187.55680: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204187.4293592-19424-134026617967472/ /root/.ansible/tmp/ansible-tmp-1727204187.4293592-19424-134026617967472/AnsiballZ_setup.py && sleep 0' 15980 1727204187.56305: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204187.56338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204187.56381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204187.56460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204187.56483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204187.56498: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204187.56663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204187.58873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204187.58878: stdout chunk (state=3): >>><<< 15980 1727204187.58880: stderr chunk (state=3): >>><<< 15980 1727204187.58883: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204187.58892: _low_level_execute_command(): starting 15980 1727204187.58895: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204187.4293592-19424-134026617967472/AnsiballZ_setup.py && sleep 0' 15980 1727204187.60793: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204187.60899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204187.61024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204187.61117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204187.61222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 
1727204187.61376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204188.26626: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], 
"ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.7236328125, "5m": 0.53466796875, "15m": 0.26904296875}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3047, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 669, "free": 3047}, "nocache": {"free": 3477, "used": 239}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", 
"ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "lab<<< 15980 1727204188.26657: stdout chunk (state=3): >>>els": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 534, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325526016, "block_size": 4096, "block_total": 64479564, "block_available": 61358771, "block_used": 3120793, "inode_total": 16384000, "inode_available": 
16301510, "inode_used": 82490, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": 
"5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "28", "epoch": "1727204188", "epoch_int": "1727204188", "date": "2024-09-24", "time": "14:56:28", "iso8601_micro": "2024-09-24T18:56:28.262872Z", "iso8601": "2024-09-24T18:56:28Z", "iso8601_basic": "20240924T145628262872", "iso8601_basic_short": "20240924T145628", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15980 
1727204188.28686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204188.28784: stderr chunk (state=3): >>><<< 15980 1727204188.28787: stdout chunk (state=3): >>><<< 15980 1727204188.28817: _low_level_execute_command() done: rc=0, stdout=[ansible_facts JSON identical to the stdout chunks above], stderr=[OpenSSH debug output identical to the stderr chunks above, ending with: Shared connection to 10.31.47.73 closed.]
15980 1727204188.29064: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204187.4293592-19424-134026617967472/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204188.29087: _low_level_execute_command(): starting 15980 1727204188.29091: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204187.4293592-19424-134026617967472/ > /dev/null 2>&1 && sleep 0' 15980 1727204188.29664: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 15980 1727204188.29677: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204188.29735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204188.29739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204188.29769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204188.29844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204188.31799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204188.31905: stderr chunk (state=3): >>><<< 15980 1727204188.31910: stdout chunk (state=3): >>><<< 15980 1727204188.32064: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204188.32069: handler run complete 15980 
1727204188.32101: variable 'ansible_facts' from source: unknown 15980 1727204188.32179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204188.32396: variable 'ansible_facts' from source: unknown 15980 1727204188.32473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204188.32578: attempt loop complete, returning result 15980 1727204188.32592: _execute() done 15980 1727204188.32596: dumping result to json 15980 1727204188.32638: done dumping result, returning 15980 1727204188.32641: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-5f1d-4b72-0000000004c5] 15980 1727204188.32644: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000004c5 ok: [managed-node2] 15980 1727204188.33398: no more pending results, returning what we have 15980 1727204188.33400: results queue empty 15980 1727204188.33401: checking for any_errors_fatal 15980 1727204188.33401: done checking for any_errors_fatal 15980 1727204188.33402: checking for max_fail_percentage 15980 1727204188.33403: done checking for max_fail_percentage 15980 1727204188.33404: checking to see if all hosts have failed and the running result is not ok 15980 1727204188.33404: done checking to see if all hosts have failed 15980 1727204188.33405: getting the remaining hosts for this loop 15980 1727204188.33406: done getting the remaining hosts for this loop 15980 1727204188.33410: getting the next task for host managed-node2 15980 1727204188.33414: done getting next task for host managed-node2 15980 1727204188.33416: ^ task is: TASK: meta (flush_handlers) 15980 1727204188.33417: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15980 1727204188.33426: getting variables 15980 1727204188.33430: in VariableManager get_vars() 15980 1727204188.33463: Calling all_inventory to load vars for managed-node2 15980 1727204188.33470: Calling groups_inventory to load vars for managed-node2 15980 1727204188.33477: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204188.33493: Calling all_plugins_play to load vars for managed-node2 15980 1727204188.33499: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204188.33506: Calling groups_plugins_play to load vars for managed-node2 15980 1727204188.34085: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000004c5 15980 1727204188.34091: WORKER PROCESS EXITING 15980 1727204188.35212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204188.37018: done with get_vars() 15980 1727204188.37039: done getting variables 15980 1727204188.37102: in VariableManager get_vars() 15980 1727204188.37110: Calling all_inventory to load vars for managed-node2 15980 1727204188.37112: Calling groups_inventory to load vars for managed-node2 15980 1727204188.37114: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204188.37117: Calling all_plugins_play to load vars for managed-node2 15980 1727204188.37119: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204188.37121: Calling groups_plugins_play to load vars for managed-node2 15980 1727204188.38152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204188.40358: done with get_vars() 15980 1727204188.40406: done queuing things up, now waiting for results queue to drain 15980 1727204188.40410: results queue empty 15980 1727204188.40411: checking for any_errors_fatal 15980 1727204188.40414: done checking for any_errors_fatal 15980 1727204188.40415: checking for 
max_fail_percentage 15980 1727204188.40416: done checking for max_fail_percentage 15980 1727204188.40416: checking to see if all hosts have failed and the running result is not ok 15980 1727204188.40421: done checking to see if all hosts have failed 15980 1727204188.40422: getting the remaining hosts for this loop 15980 1727204188.40423: done getting the remaining hosts for this loop 15980 1727204188.40427: getting the next task for host managed-node2 15980 1727204188.40430: done getting next task for host managed-node2 15980 1727204188.40432: ^ task is: TASK: Include the task '{{ task }}' 15980 1727204188.40433: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204188.40435: getting variables 15980 1727204188.40435: in VariableManager get_vars() 15980 1727204188.40443: Calling all_inventory to load vars for managed-node2 15980 1727204188.40445: Calling groups_inventory to load vars for managed-node2 15980 1727204188.40447: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204188.40453: Calling all_plugins_play to load vars for managed-node2 15980 1727204188.40455: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204188.40457: Calling groups_plugins_play to load vars for managed-node2 15980 1727204188.41873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204188.44526: done with get_vars() 15980 1727204188.44570: done getting variables 15980 1727204188.44730: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_absent.yml'] *********************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Tuesday 
24 September 2024 14:56:28 -0400 (0:00:01.087) 0:00:49.857 ***** 15980 1727204188.44767: entering _queue_task() for managed-node2/include_tasks 15980 1727204188.45091: worker is 1 (out of 1 available) 15980 1727204188.45110: exiting _queue_task() for managed-node2/include_tasks 15980 1727204188.45124: done queuing things up, now waiting for results queue to drain 15980 1727204188.45128: waiting for pending results... 15980 1727204188.45370: running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_device_absent.yml' 15980 1727204188.45451: in run() - task 127b8e07-fff9-5f1d-4b72-000000000077 15980 1727204188.45464: variable 'ansible_search_path' from source: unknown 15980 1727204188.45503: calling self._execute() 15980 1727204188.45591: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204188.45596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204188.45612: variable 'omit' from source: magic vars 15980 1727204188.45936: variable 'ansible_distribution_major_version' from source: facts 15980 1727204188.45947: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204188.45955: variable 'task' from source: play vars 15980 1727204188.46008: variable 'task' from source: play vars 15980 1727204188.46015: _execute() done 15980 1727204188.46018: dumping result to json 15980 1727204188.46021: done dumping result, returning 15980 1727204188.46030: done running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_device_absent.yml' [127b8e07-fff9-5f1d-4b72-000000000077] 15980 1727204188.46038: sending task result for task 127b8e07-fff9-5f1d-4b72-000000000077 15980 1727204188.46150: done sending task result for task 127b8e07-fff9-5f1d-4b72-000000000077 15980 1727204188.46153: WORKER PROCESS EXITING 15980 1727204188.46184: no more pending results, returning what we have 15980 1727204188.46189: in VariableManager get_vars() 15980 1727204188.46228: 
Calling all_inventory to load vars for managed-node2 15980 1727204188.46233: Calling groups_inventory to load vars for managed-node2 15980 1727204188.46236: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204188.46252: Calling all_plugins_play to load vars for managed-node2 15980 1727204188.46255: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204188.46257: Calling groups_plugins_play to load vars for managed-node2 15980 1727204188.47352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204188.49972: done with get_vars() 15980 1727204188.49999: variable 'ansible_search_path' from source: unknown 15980 1727204188.50018: we have included files to process 15980 1727204188.50020: generating all_blocks data 15980 1727204188.50021: done generating all_blocks data 15980 1727204188.50022: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15980 1727204188.50023: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15980 1727204188.50029: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15980 1727204188.50174: in VariableManager get_vars() 15980 1727204188.50194: done with get_vars() 15980 1727204188.50428: done processing included file 15980 1727204188.50430: iterating over new_blocks loaded from include file 15980 1727204188.50458: in VariableManager get_vars() 15980 1727204188.50503: done with get_vars() 15980 1727204188.50510: filtering new block on tags 15980 1727204188.50538: done filtering new block on tags 15980 1727204188.50541: done iterating over new_blocks loaded from include file included: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 15980 1727204188.50555: extending task lists for all hosts with included blocks 15980 1727204188.50693: done extending task lists 15980 1727204188.50698: done processing included files 15980 1727204188.50699: results queue empty 15980 1727204188.50699: checking for any_errors_fatal 15980 1727204188.50704: done checking for any_errors_fatal 15980 1727204188.50707: checking for max_fail_percentage 15980 1727204188.50709: done checking for max_fail_percentage 15980 1727204188.50710: checking to see if all hosts have failed and the running result is not ok 15980 1727204188.50711: done checking to see if all hosts have failed 15980 1727204188.50711: getting the remaining hosts for this loop 15980 1727204188.50716: done getting the remaining hosts for this loop 15980 1727204188.50721: getting the next task for host managed-node2 15980 1727204188.50729: done getting next task for host managed-node2 15980 1727204188.50732: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15980 1727204188.50737: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204188.50740: getting variables 15980 1727204188.50741: in VariableManager get_vars() 15980 1727204188.50761: Calling all_inventory to load vars for managed-node2 15980 1727204188.50771: Calling groups_inventory to load vars for managed-node2 15980 1727204188.50774: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204188.50783: Calling all_plugins_play to load vars for managed-node2 15980 1727204188.50786: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204188.50789: Calling groups_plugins_play to load vars for managed-node2 15980 1727204188.53145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204188.56217: done with get_vars() 15980 1727204188.56261: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 14:56:28 -0400 (0:00:00.115) 0:00:49.973 ***** 15980 1727204188.56360: entering _queue_task() for managed-node2/include_tasks 15980 1727204188.57142: worker is 1 (out of 1 available) 15980 1727204188.57156: exiting _queue_task() for managed-node2/include_tasks 15980 1727204188.57372: done queuing things up, now waiting for results queue to drain 15980 1727204188.57375: waiting for pending results... 
15980 1727204188.57814: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 15980 1727204188.58176: in run() - task 127b8e07-fff9-5f1d-4b72-0000000004d6 15980 1727204188.58180: variable 'ansible_search_path' from source: unknown 15980 1727204188.58182: variable 'ansible_search_path' from source: unknown 15980 1727204188.58186: calling self._execute() 15980 1727204188.58331: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204188.58343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204188.58357: variable 'omit' from source: magic vars 15980 1727204188.58791: variable 'ansible_distribution_major_version' from source: facts 15980 1727204188.58814: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204188.58836: _execute() done 15980 1727204188.58846: dumping result to json 15980 1727204188.58854: done dumping result, returning 15980 1727204188.58867: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-5f1d-4b72-0000000004d6] 15980 1727204188.58877: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000004d6 15980 1727204188.59097: no more pending results, returning what we have 15980 1727204188.59104: in VariableManager get_vars() 15980 1727204188.59145: Calling all_inventory to load vars for managed-node2 15980 1727204188.59148: Calling groups_inventory to load vars for managed-node2 15980 1727204188.59152: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204188.59172: Calling all_plugins_play to load vars for managed-node2 15980 1727204188.59175: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204188.59179: Calling groups_plugins_play to load vars for managed-node2 15980 1727204188.59786: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000004d6 15980 1727204188.59791: WORKER PROCESS EXITING 15980 
1727204188.61392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204188.63580: done with get_vars() 15980 1727204188.63616: variable 'ansible_search_path' from source: unknown 15980 1727204188.63618: variable 'ansible_search_path' from source: unknown 15980 1727204188.63632: variable 'task' from source: play vars 15980 1727204188.63752: variable 'task' from source: play vars 15980 1727204188.63797: we have included files to process 15980 1727204188.63798: generating all_blocks data 15980 1727204188.63800: done generating all_blocks data 15980 1727204188.63801: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15980 1727204188.63803: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15980 1727204188.63805: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15980 1727204188.64010: done processing included file 15980 1727204188.64012: iterating over new_blocks loaded from include file 15980 1727204188.64014: in VariableManager get_vars() 15980 1727204188.64031: done with get_vars() 15980 1727204188.64033: filtering new block on tags 15980 1727204188.64049: done filtering new block on tags 15980 1727204188.64052: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 15980 1727204188.64057: extending task lists for all hosts with included blocks 15980 1727204188.64175: done extending task lists 15980 1727204188.64176: done processing included files 15980 1727204188.64177: results queue empty 15980 1727204188.64178: checking for any_errors_fatal 15980 1727204188.64182: done checking 
for any_errors_fatal 15980 1727204188.64183: checking for max_fail_percentage 15980 1727204188.64184: done checking for max_fail_percentage 15980 1727204188.64185: checking to see if all hosts have failed and the running result is not ok 15980 1727204188.64186: done checking to see if all hosts have failed 15980 1727204188.64187: getting the remaining hosts for this loop 15980 1727204188.64188: done getting the remaining hosts for this loop 15980 1727204188.64191: getting the next task for host managed-node2 15980 1727204188.64195: done getting next task for host managed-node2 15980 1727204188.64197: ^ task is: TASK: Get stat for interface {{ interface }} 15980 1727204188.64200: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204188.64203: getting variables 15980 1727204188.64204: in VariableManager get_vars() 15980 1727204188.64214: Calling all_inventory to load vars for managed-node2 15980 1727204188.64216: Calling groups_inventory to load vars for managed-node2 15980 1727204188.64219: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204188.64227: Calling all_plugins_play to load vars for managed-node2 15980 1727204188.64230: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204188.64233: Calling groups_plugins_play to load vars for managed-node2 15980 1727204188.65819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204188.68260: done with get_vars() 15980 1727204188.68294: done getting variables 15980 1727204188.68676: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:56:28 -0400 (0:00:00.123) 0:00:50.097 ***** 15980 1727204188.68710: entering _queue_task() for managed-node2/stat 15980 1727204188.69608: worker is 1 (out of 1 available) 15980 1727204188.69621: exiting _queue_task() for managed-node2/stat 15980 1727204188.69636: done queuing things up, now waiting for results queue to drain 15980 1727204188.69638: waiting for pending results... 
15980 1727204188.70235: running TaskExecutor() for managed-node2/TASK: Get stat for interface LSR-TST-br31 15980 1727204188.70842: in run() - task 127b8e07-fff9-5f1d-4b72-0000000004e1 15980 1727204188.70846: variable 'ansible_search_path' from source: unknown 15980 1727204188.70850: variable 'ansible_search_path' from source: unknown 15980 1727204188.70853: calling self._execute() 15980 1727204188.71045: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204188.71212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204188.71231: variable 'omit' from source: magic vars 15980 1727204188.71667: variable 'ansible_distribution_major_version' from source: facts 15980 1727204188.71688: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204188.71700: variable 'omit' from source: magic vars 15980 1727204188.71760: variable 'omit' from source: magic vars 15980 1727204188.71879: variable 'interface' from source: set_fact 15980 1727204188.71902: variable 'omit' from source: magic vars 15980 1727204188.71955: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204188.72008: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204188.72040: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204188.72269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204188.72273: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204188.72276: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204188.72278: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204188.72280: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204188.72282: Set connection var ansible_connection to ssh 15980 1727204188.72284: Set connection var ansible_pipelining to False 15980 1727204188.72286: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204188.72288: Set connection var ansible_timeout to 10 15980 1727204188.72290: Set connection var ansible_shell_type to sh 15980 1727204188.72292: Set connection var ansible_shell_executable to /bin/sh 15980 1727204188.72320: variable 'ansible_shell_executable' from source: unknown 15980 1727204188.72331: variable 'ansible_connection' from source: unknown 15980 1727204188.72339: variable 'ansible_module_compression' from source: unknown 15980 1727204188.72345: variable 'ansible_shell_type' from source: unknown 15980 1727204188.72353: variable 'ansible_shell_executable' from source: unknown 15980 1727204188.72360: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204188.72370: variable 'ansible_pipelining' from source: unknown 15980 1727204188.72377: variable 'ansible_timeout' from source: unknown 15980 1727204188.72384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204188.72622: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15980 1727204188.72645: variable 'omit' from source: magic vars 15980 1727204188.72656: starting attempt loop 15980 1727204188.72662: running the handler 15980 1727204188.72684: _low_level_execute_command(): starting 15980 1727204188.72696: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204188.73502: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204188.73517: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 15980 1727204188.73533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204188.73585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204188.73662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204188.73685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204188.73838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204188.74056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204188.75768: stdout chunk (state=3): >>>/root <<< 15980 1727204188.76086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204188.76090: stdout chunk (state=3): >>><<< 15980 1727204188.76093: stderr chunk (state=3): >>><<< 15980 1727204188.76342: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204188.76346: _low_level_execute_command(): starting 15980 1727204188.76350: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204188.7623262-19473-124996045148480 `" && echo ansible-tmp-1727204188.7623262-19473-124996045148480="` echo /root/.ansible/tmp/ansible-tmp-1727204188.7623262-19473-124996045148480 `" ) && sleep 0' 15980 1727204188.77346: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204188.77363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204188.77386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204188.77439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204188.77534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204188.77651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204188.85275: stdout chunk (state=3): >>>ansible-tmp-1727204188.7623262-19473-124996045148480=/root/.ansible/tmp/ansible-tmp-1727204188.7623262-19473-124996045148480 <<< 15980 1727204188.85374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204188.85388: stdout chunk (state=3): >>><<< 15980 1727204188.85428: stderr chunk (state=3): >>><<< 15980 1727204188.85498: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204188.7623262-19473-124996045148480=/root/.ansible/tmp/ansible-tmp-1727204188.7623262-19473-124996045148480 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204188.85772: variable 'ansible_module_compression' from source: unknown 15980 1727204188.85776: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15980 1727204188.85949: variable 'ansible_facts' from source: unknown 15980 1727204188.86156: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204188.7623262-19473-124996045148480/AnsiballZ_stat.py 15980 1727204188.86380: Sending initial data 15980 1727204188.86414: Sent initial data (153 bytes) 15980 1727204188.87197: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204188.87258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 1727204188.87273: stderr chunk (state=3): >>>debug2: match found <<< 15980 1727204188.87357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204188.87382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204188.87490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204188.89157: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204188.89225: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204188.89303: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmppf6s1716 /root/.ansible/tmp/ansible-tmp-1727204188.7623262-19473-124996045148480/AnsiballZ_stat.py <<< 15980 1727204188.89307: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204188.7623262-19473-124996045148480/AnsiballZ_stat.py" <<< 15980 1727204188.89361: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmppf6s1716" to remote "/root/.ansible/tmp/ansible-tmp-1727204188.7623262-19473-124996045148480/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204188.7623262-19473-124996045148480/AnsiballZ_stat.py" <<< 15980 1727204188.90989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204188.91294: stderr chunk (state=3): >>><<< 15980 1727204188.91298: stdout chunk (state=3): >>><<< 15980 1727204188.91301: done transferring module to remote 15980 1727204188.91303: _low_level_execute_command(): starting 15980 1727204188.91307: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204188.7623262-19473-124996045148480/ /root/.ansible/tmp/ansible-tmp-1727204188.7623262-19473-124996045148480/AnsiballZ_stat.py && sleep 0' 15980 1727204188.92560: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204188.92724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204188.92909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204188.92932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204188.92981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204188.93043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204188.95002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204188.95137: stderr chunk (state=3): >>><<< 15980 1727204188.95156: stdout chunk (state=3): >>><<< 15980 1727204188.95304: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204188.95308: _low_level_execute_command(): starting 15980 1727204188.95311: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204188.7623262-19473-124996045148480/AnsiballZ_stat.py && sleep 0' 15980 1727204188.96757: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204188.96762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204188.96777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204188.96790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204188.96945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204188.97290: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 
1727204188.97579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204189.13955: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15980 1727204189.15317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204189.15335: stdout chunk (state=3): >>><<< 15980 1727204189.15349: stderr chunk (state=3): >>><<< 15980 1727204189.15374: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 15980 1727204189.15421: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204188.7623262-19473-124996045148480/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204189.15491: _low_level_execute_command(): starting 15980 1727204189.15494: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204188.7623262-19473-124996045148480/ > /dev/null 2>&1 && sleep 0' 15980 1727204189.16231: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204189.16235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204189.16238: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204189.16240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204189.16294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204189.16308: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204189.16400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204189.18491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204189.18530: stderr chunk (state=3): >>><<< 15980 1727204189.18541: stdout chunk (state=3): >>><<< 15980 1727204189.18576: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204189.18588: handler run complete 15980 1727204189.18615: attempt loop complete, returning result 15980 1727204189.18623: _execute() done 15980 1727204189.18672: dumping result to json 15980 1727204189.18675: done dumping result, returning 15980 1727204189.18678: done running TaskExecutor() for managed-node2/TASK: Get stat for interface LSR-TST-br31 [127b8e07-fff9-5f1d-4b72-0000000004e1] 15980 1727204189.18680: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000004e1 ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 15980 1727204189.18928: no more pending results, returning what we have 15980 1727204189.18932: results queue empty 15980 1727204189.18933: checking for any_errors_fatal 15980 1727204189.18935: done checking for any_errors_fatal 15980 1727204189.18936: checking for max_fail_percentage 15980 1727204189.18937: done checking for max_fail_percentage 15980 1727204189.18938: checking to see if all hosts have failed and the running result is not ok 15980 1727204189.18939: done checking to see if all hosts have failed 15980 1727204189.18940: getting the remaining hosts for this loop 15980 1727204189.18942: done getting the remaining hosts for this loop 15980 1727204189.18945: getting the next task for host managed-node2 15980 1727204189.18954: done getting next task for host managed-node2 15980 1727204189.18957: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 15980 1727204189.18960: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204189.18968: getting variables 15980 1727204189.18970: in VariableManager get_vars() 15980 1727204189.19002: Calling all_inventory to load vars for managed-node2 15980 1727204189.19004: Calling groups_inventory to load vars for managed-node2 15980 1727204189.19008: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204189.19022: Calling all_plugins_play to load vars for managed-node2 15980 1727204189.19027: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204189.19031: Calling groups_plugins_play to load vars for managed-node2 15980 1727204189.19584: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000004e1 15980 1727204189.19589: WORKER PROCESS EXITING 15980 1727204189.21172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204189.23808: done with get_vars() 15980 1727204189.23842: done getting variables 15980 1727204189.23894: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15980 1727204189.23994: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 14:56:29 -0400 
(0:00:00.553) 0:00:50.650 ***** 15980 1727204189.24021: entering _queue_task() for managed-node2/assert 15980 1727204189.24319: worker is 1 (out of 1 available) 15980 1727204189.24338: exiting _queue_task() for managed-node2/assert 15980 1727204189.24352: done queuing things up, now waiting for results queue to drain 15980 1727204189.24354: waiting for pending results... 15980 1727204189.24544: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' 15980 1727204189.24612: in run() - task 127b8e07-fff9-5f1d-4b72-0000000004d7 15980 1727204189.24624: variable 'ansible_search_path' from source: unknown 15980 1727204189.24631: variable 'ansible_search_path' from source: unknown 15980 1727204189.24662: calling self._execute() 15980 1727204189.24747: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204189.24754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204189.24762: variable 'omit' from source: magic vars 15980 1727204189.25271: variable 'ansible_distribution_major_version' from source: facts 15980 1727204189.25274: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204189.25277: variable 'omit' from source: magic vars 15980 1727204189.25279: variable 'omit' from source: magic vars 15980 1727204189.25374: variable 'interface' from source: set_fact 15980 1727204189.25401: variable 'omit' from source: magic vars 15980 1727204189.25450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204189.25499: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204189.25531: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204189.25558: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
15980 1727204189.25578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204189.25617: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204189.25627: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204189.25635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204189.25750: Set connection var ansible_connection to ssh 15980 1727204189.25777: Set connection var ansible_pipelining to False 15980 1727204189.25781: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204189.25783: Set connection var ansible_timeout to 10 15980 1727204189.25791: Set connection var ansible_shell_type to sh 15980 1727204189.25812: Set connection var ansible_shell_executable to /bin/sh 15980 1727204189.25829: variable 'ansible_shell_executable' from source: unknown 15980 1727204189.25833: variable 'ansible_connection' from source: unknown 15980 1727204189.25835: variable 'ansible_module_compression' from source: unknown 15980 1727204189.25841: variable 'ansible_shell_type' from source: unknown 15980 1727204189.25848: variable 'ansible_shell_executable' from source: unknown 15980 1727204189.25855: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204189.25863: variable 'ansible_pipelining' from source: unknown 15980 1727204189.25872: variable 'ansible_timeout' from source: unknown 15980 1727204189.25882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204189.26154: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204189.26307: variable 'omit' from 
source: magic vars 15980 1727204189.26329: starting attempt loop 15980 1727204189.26337: running the handler 15980 1727204189.26888: variable 'interface_stat' from source: set_fact 15980 1727204189.27070: Evaluated conditional (not interface_stat.stat.exists): True 15980 1727204189.27074: handler run complete 15980 1727204189.27076: attempt loop complete, returning result 15980 1727204189.27079: _execute() done 15980 1727204189.27081: dumping result to json 15980 1727204189.27084: done dumping result, returning 15980 1727204189.27086: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' [127b8e07-fff9-5f1d-4b72-0000000004d7] 15980 1727204189.27089: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000004d7 15980 1727204189.27177: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000004d7 15980 1727204189.27182: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 15980 1727204189.27244: no more pending results, returning what we have 15980 1727204189.27248: results queue empty 15980 1727204189.27250: checking for any_errors_fatal 15980 1727204189.27260: done checking for any_errors_fatal 15980 1727204189.27261: checking for max_fail_percentage 15980 1727204189.27263: done checking for max_fail_percentage 15980 1727204189.27264: checking to see if all hosts have failed and the running result is not ok 15980 1727204189.27267: done checking to see if all hosts have failed 15980 1727204189.27268: getting the remaining hosts for this loop 15980 1727204189.27270: done getting the remaining hosts for this loop 15980 1727204189.27275: getting the next task for host managed-node2 15980 1727204189.27285: done getting next task for host managed-node2 15980 1727204189.27288: ^ task is: TASK: meta (flush_handlers) 15980 1727204189.27290: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204189.27295: getting variables 15980 1727204189.27298: in VariableManager get_vars() 15980 1727204189.27335: Calling all_inventory to load vars for managed-node2 15980 1727204189.27339: Calling groups_inventory to load vars for managed-node2 15980 1727204189.27343: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204189.27357: Calling all_plugins_play to load vars for managed-node2 15980 1727204189.27361: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204189.27363: Calling groups_plugins_play to load vars for managed-node2 15980 1727204189.28813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204189.30596: done with get_vars() 15980 1727204189.30639: done getting variables 15980 1727204189.30708: in VariableManager get_vars() 15980 1727204189.30716: Calling all_inventory to load vars for managed-node2 15980 1727204189.30718: Calling groups_inventory to load vars for managed-node2 15980 1727204189.30720: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204189.30724: Calling all_plugins_play to load vars for managed-node2 15980 1727204189.30728: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204189.30730: Calling groups_plugins_play to load vars for managed-node2 15980 1727204189.32019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204189.36812: done with get_vars() 15980 1727204189.36937: done queuing things up, now waiting for results queue to drain 15980 1727204189.36943: results queue empty 15980 1727204189.36945: checking for any_errors_fatal 15980 1727204189.36948: done checking for any_errors_fatal 15980 1727204189.36949: checking for 
max_fail_percentage 15980 1727204189.36950: done checking for max_fail_percentage 15980 1727204189.36951: checking to see if all hosts have failed and the running result is not ok 15980 1727204189.36952: done checking to see if all hosts have failed 15980 1727204189.36958: getting the remaining hosts for this loop 15980 1727204189.36960: done getting the remaining hosts for this loop 15980 1727204189.37030: getting the next task for host managed-node2 15980 1727204189.37036: done getting next task for host managed-node2 15980 1727204189.37037: ^ task is: TASK: meta (flush_handlers) 15980 1727204189.37039: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204189.37042: getting variables 15980 1727204189.37044: in VariableManager get_vars() 15980 1727204189.37055: Calling all_inventory to load vars for managed-node2 15980 1727204189.37057: Calling groups_inventory to load vars for managed-node2 15980 1727204189.37060: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204189.37110: Calling all_plugins_play to load vars for managed-node2 15980 1727204189.37118: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204189.37123: Calling groups_plugins_play to load vars for managed-node2 15980 1727204189.39411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204189.42475: done with get_vars() 15980 1727204189.42520: done getting variables 15980 1727204189.42651: in VariableManager get_vars() 15980 1727204189.42831: Calling all_inventory to load vars for managed-node2 15980 1727204189.42835: Calling groups_inventory to load vars for managed-node2 15980 1727204189.42838: Calling all_plugins_inventory to load vars 
for managed-node2 15980 1727204189.42845: Calling all_plugins_play to load vars for managed-node2 15980 1727204189.42847: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204189.42850: Calling groups_plugins_play to load vars for managed-node2 15980 1727204189.45407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204189.48218: done with get_vars() 15980 1727204189.48314: done queuing things up, now waiting for results queue to drain 15980 1727204189.48317: results queue empty 15980 1727204189.48318: checking for any_errors_fatal 15980 1727204189.48320: done checking for any_errors_fatal 15980 1727204189.48320: checking for max_fail_percentage 15980 1727204189.48322: done checking for max_fail_percentage 15980 1727204189.48323: checking to see if all hosts have failed and the running result is not ok 15980 1727204189.48324: done checking to see if all hosts have failed 15980 1727204189.48327: getting the remaining hosts for this loop 15980 1727204189.48328: done getting the remaining hosts for this loop 15980 1727204189.48331: getting the next task for host managed-node2 15980 1727204189.48335: done getting next task for host managed-node2 15980 1727204189.48336: ^ task is: None 15980 1727204189.48338: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204189.48339: done queuing things up, now waiting for results queue to drain 15980 1727204189.48340: results queue empty 15980 1727204189.48341: checking for any_errors_fatal 15980 1727204189.48341: done checking for any_errors_fatal 15980 1727204189.48342: checking for max_fail_percentage 15980 1727204189.48343: done checking for max_fail_percentage 15980 1727204189.48344: checking to see if all hosts have failed and the running result is not ok 15980 1727204189.48345: done checking to see if all hosts have failed 15980 1727204189.48346: getting the next task for host managed-node2 15980 1727204189.48349: done getting next task for host managed-node2 15980 1727204189.48350: ^ task is: None 15980 1727204189.48351: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204189.48400: in VariableManager get_vars() 15980 1727204189.48450: done with get_vars() 15980 1727204189.48459: in VariableManager get_vars() 15980 1727204189.48474: done with get_vars() 15980 1727204189.48479: variable 'omit' from source: magic vars 15980 1727204189.48514: in VariableManager get_vars() 15980 1727204189.48528: done with get_vars() 15980 1727204189.48552: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 15980 1727204189.48901: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15980 1727204189.48952: getting the remaining hosts for this loop 15980 1727204189.48953: done getting the remaining hosts for this loop 15980 1727204189.48956: getting the next task for host managed-node2 15980 1727204189.48959: done getting next task for host managed-node2 15980 1727204189.48961: ^ task is: TASK: Gathering Facts 15980 1727204189.48962: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204189.48965: getting variables 15980 1727204189.48968: in VariableManager get_vars() 15980 1727204189.48977: Calling all_inventory to load vars for managed-node2 15980 1727204189.48980: Calling groups_inventory to load vars for managed-node2 15980 1727204189.48982: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204189.48988: Calling all_plugins_play to load vars for managed-node2 15980 1727204189.48991: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204189.48994: Calling groups_plugins_play to load vars for managed-node2 15980 1727204189.51477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204189.53419: done with get_vars() 15980 1727204189.53446: done getting variables 15980 1727204189.53499: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.295) 0:00:50.945 ***** 15980 1727204189.53527: entering _queue_task() for managed-node2/gather_facts 15980 1727204189.54118: worker is 1 (out of 1 available) 15980 1727204189.54129: exiting _queue_task() for managed-node2/gather_facts 15980 1727204189.54140: done queuing things up, now waiting for results queue to drain 15980 1727204189.54143: waiting for pending results... 
15980 1727204189.54655: running TaskExecutor() for managed-node2/TASK: Gathering Facts 15980 1727204189.54913: in run() - task 127b8e07-fff9-5f1d-4b72-0000000004fa 15980 1727204189.54942: variable 'ansible_search_path' from source: unknown 15980 1727204189.54994: calling self._execute() 15980 1727204189.55134: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204189.55172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204189.55176: variable 'omit' from source: magic vars 15980 1727204189.55608: variable 'ansible_distribution_major_version' from source: facts 15980 1727204189.55638: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204189.55642: variable 'omit' from source: magic vars 15980 1727204189.55660: variable 'omit' from source: magic vars 15980 1727204189.55695: variable 'omit' from source: magic vars 15980 1727204189.55780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204189.55817: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204189.55871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204189.55972: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204189.55975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204189.55978: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204189.55981: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204189.55983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204189.56079: Set connection var ansible_connection to ssh 15980 1727204189.56095: Set 
connection var ansible_pipelining to False 15980 1727204189.56107: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204189.56118: Set connection var ansible_timeout to 10 15980 1727204189.56133: Set connection var ansible_shell_type to sh 15980 1727204189.56145: Set connection var ansible_shell_executable to /bin/sh 15980 1727204189.56186: variable 'ansible_shell_executable' from source: unknown 15980 1727204189.56195: variable 'ansible_connection' from source: unknown 15980 1727204189.56202: variable 'ansible_module_compression' from source: unknown 15980 1727204189.56371: variable 'ansible_shell_type' from source: unknown 15980 1727204189.56375: variable 'ansible_shell_executable' from source: unknown 15980 1727204189.56377: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204189.56380: variable 'ansible_pipelining' from source: unknown 15980 1727204189.56382: variable 'ansible_timeout' from source: unknown 15980 1727204189.56384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204189.56449: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204189.56470: variable 'omit' from source: magic vars 15980 1727204189.56483: starting attempt loop 15980 1727204189.56489: running the handler 15980 1727204189.56510: variable 'ansible_facts' from source: unknown 15980 1727204189.56537: _low_level_execute_command(): starting 15980 1727204189.56550: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204189.57847: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204189.57974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204189.58003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204189.58102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204189.59906: stdout chunk (state=3): >>>/root <<< 15980 1727204189.60101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204189.60115: stdout chunk (state=3): >>><<< 15980 1727204189.60132: stderr chunk (state=3): >>><<< 15980 1727204189.60161: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204189.60186: _low_level_execute_command(): starting 15980 1727204189.60207: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204189.6016986-19517-90505321450864 `" && echo ansible-tmp-1727204189.6016986-19517-90505321450864="` echo /root/.ansible/tmp/ansible-tmp-1727204189.6016986-19517-90505321450864 `" ) && sleep 0' 15980 1727204189.61017: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204189.61104: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204189.61110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204189.61180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204189.61199: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204189.61255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204189.61415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204189.63408: stdout chunk (state=3): >>>ansible-tmp-1727204189.6016986-19517-90505321450864=/root/.ansible/tmp/ansible-tmp-1727204189.6016986-19517-90505321450864 <<< 15980 1727204189.63640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204189.63645: stdout chunk (state=3): >>><<< 15980 1727204189.63648: stderr chunk (state=3): >>><<< 15980 1727204189.63673: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204189.6016986-19517-90505321450864=/root/.ansible/tmp/ansible-tmp-1727204189.6016986-19517-90505321450864 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204189.63736: variable 'ansible_module_compression' from source: unknown 15980 1727204189.63791: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15980 1727204189.64070: variable 'ansible_facts' from source: unknown 15980 1727204189.64073: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204189.6016986-19517-90505321450864/AnsiballZ_setup.py 15980 1727204189.64334: Sending initial data 15980 1727204189.64338: Sent initial data (153 bytes) 15980 1727204189.65044: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204189.65084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204189.65190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204189.65218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204189.65244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204189.65414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204189.67183: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15980 1727204189.67227: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204189.67278: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15980 1727204189.67386: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpud1jhspo /root/.ansible/tmp/ansible-tmp-1727204189.6016986-19517-90505321450864/AnsiballZ_setup.py <<< 15980 1727204189.67390: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204189.6016986-19517-90505321450864/AnsiballZ_setup.py" <<< 15980 1727204189.67523: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmpud1jhspo" to remote "/root/.ansible/tmp/ansible-tmp-1727204189.6016986-19517-90505321450864/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204189.6016986-19517-90505321450864/AnsiballZ_setup.py" <<< 15980 1727204189.70271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204189.70436: stderr chunk (state=3): >>><<< 15980 1727204189.70440: stdout chunk (state=3): >>><<< 15980 1727204189.70442: done transferring module to remote 15980 1727204189.70445: _low_level_execute_command(): starting 15980 1727204189.70451: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204189.6016986-19517-90505321450864/ /root/.ansible/tmp/ansible-tmp-1727204189.6016986-19517-90505321450864/AnsiballZ_setup.py && sleep 0' 15980 1727204189.71123: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204189.71249: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204189.71274: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204189.71304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204189.71410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204189.73376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204189.73484: stderr chunk (state=3): >>><<< 15980 1727204189.73505: stdout chunk (state=3): >>><<< 15980 1727204189.73529: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204189.73541: _low_level_execute_command(): starting 15980 1727204189.73553: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204189.6016986-19517-90505321450864/AnsiballZ_setup.py && sleep 0' 15980 1727204189.74260: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15980 1727204189.74279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15980 1727204189.74298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204189.74319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15980 1727204189.74338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 15980 1727204189.74420: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204189.74453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204189.74473: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 15980 1727204189.74499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204189.74615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204190.40760: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": 
"enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": 
"en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "30", "epoch": "1727204190", "epoch_int": "1727204190", "date": "2024-09-24", "time": "14:56:30", "iso8601_micro": "2024-09-24T18:56:30.047013Z", "iso8601": "2024-09-24T18:56:30Z", "iso8601_basic": "20240924T145630047013", "iso8601_basic_short": "20240924T145630", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_loadavg": {"1m": 0.66552734375, "5m": 0.525390625, "15m": 0.267578125}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3051, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 665, "free": 3051}, "nocache": {"free": 3481, "used": 235}, "swap": {"total": 3715, 
"free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": 
{}, "masters": {}}, "ansible_uptime_seconds": 536, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325505536, "block_size": 4096, "block_total": 64479564, "block_available": 61358766, "block_used": 3120798, "inode_total": 16384000, "inode_available": 16301510, "inode_used": 82490, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], 
"fact_path": "/etc/ansible/facts.d"}}} <<< 15980 1727204190.42830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 15980 1727204190.42900: stderr chunk (state=3): >>><<< 15980 1727204190.42904: stdout chunk (state=3): >>><<< 15980 1727204190.42929: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": 
"enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": 
"/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "30", "epoch": "1727204190", "epoch_int": "1727204190", "date": "2024-09-24", "time": "14:56:30", "iso8601_micro": "2024-09-24T18:56:30.047013Z", "iso8601": "2024-09-24T18:56:30Z", "iso8601_basic": "20240924T145630047013", "iso8601_basic_short": "20240924T145630", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_loadavg": {"1m": 0.66552734375, "5m": 0.525390625, "15m": 0.267578125}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3051, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 665, "free": 
3051}, "nocache": {"free": 3481, "used": 235}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, 
"uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 536, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251325505536, "block_size": 4096, "block_total": 64479564, "block_available": 61358766, "block_used": 3120798, "inode_total": 16384000, "inode_available": 16301510, "inode_used": 82490, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": 
{"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
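The setup-module result above returns the gathered facts as one JSON document under `ansible_facts` (invoked here with `gather_subset: ["all"]`). As an illustrative aside, not part of the log, the sketch below shows how a few of the values actually reported for managed-node2 could be pulled out of such a parsed facts mapping; the `facts` dict literal reproduces a small subset of the payload above, and `default_cidr` is a hypothetical helper, not an Ansible API.

```python
# Minimal sketch: reading a subset of the "ansible_facts" payload logged
# above. The values below are copied from the managed-node2 result; the
# helper name is illustrative only.
facts = {
    "ansible_distribution": "Fedora",
    "ansible_distribution_major_version": "40",
    "ansible_service_mgr": "systemd",
    "ansible_default_ipv4": {
        "interface": "eth0",
        "address": "10.31.47.73",
        "prefix": "22",
        "gateway": "10.31.44.1",
    },
}

def default_cidr(f):
    """Return the default IPv4 address in CIDR form, or None if absent."""
    ip = f.get("ansible_default_ipv4", {})
    if "address" in ip and "prefix" in ip:
        return f"{ip['address']}/{ip['prefix']}"
    return None

print(default_cidr(facts))  # -> 10.31.47.73/22
print(facts["ansible_distribution"], facts["ansible_distribution_major_version"])
```

This mirrors what the play itself does later in the log, where conditionals such as `ansible_distribution_major_version != '6'` are evaluated against these same gathered facts.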
15980 1727204190.43295: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204189.6016986-19517-90505321450864/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204190.43318: _low_level_execute_command(): starting 15980 1727204190.43321: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204189.6016986-19517-90505321450864/ > /dev/null 2>&1 && sleep 0' 15980 1727204190.43831: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204190.43835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204190.43837: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204190.43839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204190.43901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204190.43906: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204190.43909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204190.43984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204190.45878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204190.45943: stderr chunk (state=3): >>><<< 15980 1727204190.45947: stdout chunk (state=3): >>><<< 15980 1727204190.45962: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 15980 1727204190.45972: handler run complete 15980 1727204190.46061: variable 'ansible_facts' from source: unknown 15980 1727204190.46140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204190.46350: variable 'ansible_facts' from source: unknown 15980 1727204190.46411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204190.46496: attempt loop complete, returning result 15980 1727204190.46499: _execute() done 15980 1727204190.46502: dumping result to json 15980 1727204190.46518: done dumping result, returning 15980 1727204190.46526: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-5f1d-4b72-0000000004fa] 15980 1727204190.46535: sending task result for task 127b8e07-fff9-5f1d-4b72-0000000004fa 15980 1727204190.46770: done sending task result for task 127b8e07-fff9-5f1d-4b72-0000000004fa 15980 1727204190.46773: WORKER PROCESS EXITING ok: [managed-node2] 15980 1727204190.47053: no more pending results, returning what we have 15980 1727204190.47055: results queue empty 15980 1727204190.47056: checking for any_errors_fatal 15980 1727204190.47056: done checking for any_errors_fatal 15980 1727204190.47057: checking for max_fail_percentage 15980 1727204190.47058: done checking for max_fail_percentage 15980 1727204190.47059: checking to see if all hosts have failed and the running result is not ok 15980 1727204190.47059: done checking to see if all hosts have failed 15980 1727204190.47060: getting the remaining hosts for this loop 15980 1727204190.47061: done getting the remaining hosts for this loop 15980 1727204190.47064: getting the next task for host managed-node2 15980 1727204190.47069: done getting next task for host managed-node2 15980 1727204190.47071: ^ task is: TASK: meta (flush_handlers) 15980 1727204190.47073: ^ state is: HOST STATE: block=1, task=1, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15980 1727204190.47075: getting variables 15980 1727204190.47076: in VariableManager get_vars() 15980 1727204190.47094: Calling all_inventory to load vars for managed-node2 15980 1727204190.47096: Calling groups_inventory to load vars for managed-node2 15980 1727204190.47098: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204190.47114: Calling all_plugins_play to load vars for managed-node2 15980 1727204190.47116: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204190.47118: Calling groups_plugins_play to load vars for managed-node2 15980 1727204190.48097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204190.49317: done with get_vars() 15980 1727204190.49349: done getting variables 15980 1727204190.49413: in VariableManager get_vars() 15980 1727204190.49422: Calling all_inventory to load vars for managed-node2 15980 1727204190.49424: Calling groups_inventory to load vars for managed-node2 15980 1727204190.49428: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204190.49432: Calling all_plugins_play to load vars for managed-node2 15980 1727204190.49434: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204190.49436: Calling groups_plugins_play to load vars for managed-node2 15980 1727204190.54575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204190.55755: done with get_vars() 15980 1727204190.55792: done queuing things up, now waiting for results queue to drain 15980 1727204190.55795: results queue empty 15980 1727204190.55796: checking for any_errors_fatal 15980 
1727204190.55799: done checking for any_errors_fatal 15980 1727204190.55800: checking for max_fail_percentage 15980 1727204190.55801: done checking for max_fail_percentage 15980 1727204190.55806: checking to see if all hosts have failed and the running result is not ok 15980 1727204190.55806: done checking to see if all hosts have failed 15980 1727204190.55807: getting the remaining hosts for this loop 15980 1727204190.55808: done getting the remaining hosts for this loop 15980 1727204190.55810: getting the next task for host managed-node2 15980 1727204190.55813: done getting next task for host managed-node2 15980 1727204190.55815: ^ task is: TASK: Verify network state restored to default 15980 1727204190.55816: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204190.55818: getting variables 15980 1727204190.55818: in VariableManager get_vars() 15980 1727204190.55828: Calling all_inventory to load vars for managed-node2 15980 1727204190.55829: Calling groups_inventory to load vars for managed-node2 15980 1727204190.55831: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204190.55837: Calling all_plugins_play to load vars for managed-node2 15980 1727204190.55838: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204190.55840: Calling groups_plugins_play to load vars for managed-node2 15980 1727204190.56724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204190.57981: done with get_vars() 15980 1727204190.58001: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:67 Tuesday 24 September 2024 14:56:30 -0400 (0:00:01.045) 0:00:51.990 ***** 15980 1727204190.58068: entering _queue_task() for managed-node2/include_tasks 15980 1727204190.58419: worker is 1 (out of 1 available) 15980 1727204190.58434: exiting _queue_task() for managed-node2/include_tasks 15980 1727204190.58448: done queuing things up, now waiting for results queue to drain 15980 1727204190.58451: waiting for pending results... 
15980 1727204190.58647: running TaskExecutor() for managed-node2/TASK: Verify network state restored to default 15980 1727204190.58732: in run() - task 127b8e07-fff9-5f1d-4b72-00000000007a 15980 1727204190.58743: variable 'ansible_search_path' from source: unknown 15980 1727204190.58778: calling self._execute() 15980 1727204190.58859: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204190.58863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204190.58874: variable 'omit' from source: magic vars 15980 1727204190.59193: variable 'ansible_distribution_major_version' from source: facts 15980 1727204190.59203: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204190.59209: _execute() done 15980 1727204190.59213: dumping result to json 15980 1727204190.59216: done dumping result, returning 15980 1727204190.59227: done running TaskExecutor() for managed-node2/TASK: Verify network state restored to default [127b8e07-fff9-5f1d-4b72-00000000007a] 15980 1727204190.59232: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000007a 15980 1727204190.59338: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000007a 15980 1727204190.59342: WORKER PROCESS EXITING 15980 1727204190.59375: no more pending results, returning what we have 15980 1727204190.59380: in VariableManager get_vars() 15980 1727204190.59417: Calling all_inventory to load vars for managed-node2 15980 1727204190.59420: Calling groups_inventory to load vars for managed-node2 15980 1727204190.59424: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204190.59441: Calling all_plugins_play to load vars for managed-node2 15980 1727204190.59444: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204190.59447: Calling groups_plugins_play to load vars for managed-node2 15980 1727204190.60511: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204190.61717: done with get_vars() 15980 1727204190.61743: variable 'ansible_search_path' from source: unknown 15980 1727204190.61759: we have included files to process 15980 1727204190.61760: generating all_blocks data 15980 1727204190.61762: done generating all_blocks data 15980 1727204190.61762: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15980 1727204190.61763: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15980 1727204190.61766: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15980 1727204190.62088: done processing included file 15980 1727204190.62090: iterating over new_blocks loaded from include file 15980 1727204190.62091: in VariableManager get_vars() 15980 1727204190.62101: done with get_vars() 15980 1727204190.62103: filtering new block on tags 15980 1727204190.62115: done filtering new block on tags 15980 1727204190.62117: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node2 15980 1727204190.62121: extending task lists for all hosts with included blocks 15980 1727204190.62143: done extending task lists 15980 1727204190.62144: done processing included files 15980 1727204190.62145: results queue empty 15980 1727204190.62145: checking for any_errors_fatal 15980 1727204190.62147: done checking for any_errors_fatal 15980 1727204190.62147: checking for max_fail_percentage 15980 1727204190.62148: done checking for max_fail_percentage 15980 1727204190.62148: checking to see if all hosts have failed and the running 
result is not ok 15980 1727204190.62149: done checking to see if all hosts have failed 15980 1727204190.62149: getting the remaining hosts for this loop 15980 1727204190.62150: done getting the remaining hosts for this loop 15980 1727204190.62152: getting the next task for host managed-node2 15980 1727204190.62154: done getting next task for host managed-node2 15980 1727204190.62155: ^ task is: TASK: Check routes and DNS 15980 1727204190.62157: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15980 1727204190.62159: getting variables 15980 1727204190.62160: in VariableManager get_vars() 15980 1727204190.62168: Calling all_inventory to load vars for managed-node2 15980 1727204190.62170: Calling groups_inventory to load vars for managed-node2 15980 1727204190.62171: Calling all_plugins_inventory to load vars for managed-node2 15980 1727204190.62176: Calling all_plugins_play to load vars for managed-node2 15980 1727204190.62178: Calling groups_plugins_inventory to load vars for managed-node2 15980 1727204190.62181: Calling groups_plugins_play to load vars for managed-node2 15980 1727204190.63113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15980 1727204190.64296: done with get_vars() 15980 1727204190.64323: done getting variables 15980 1727204190.64367: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.063) 0:00:52.054 ***** 15980 1727204190.64394: entering _queue_task() for managed-node2/shell 15980 1727204190.64695: worker is 1 (out of 1 available) 15980 1727204190.64710: exiting _queue_task() for managed-node2/shell 15980 1727204190.64723: done queuing things up, now waiting for results queue to drain 15980 1727204190.64726: waiting for pending results... 
15980 1727204190.64928: running TaskExecutor() for managed-node2/TASK: Check routes and DNS 15980 1727204190.65019: in run() - task 127b8e07-fff9-5f1d-4b72-00000000050b 15980 1727204190.65034: variable 'ansible_search_path' from source: unknown 15980 1727204190.65038: variable 'ansible_search_path' from source: unknown 15980 1727204190.65076: calling self._execute() 15980 1727204190.65157: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204190.65161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204190.65177: variable 'omit' from source: magic vars 15980 1727204190.65498: variable 'ansible_distribution_major_version' from source: facts 15980 1727204190.65516: Evaluated conditional (ansible_distribution_major_version != '6'): True 15980 1727204190.65520: variable 'omit' from source: magic vars 15980 1727204190.65551: variable 'omit' from source: magic vars 15980 1727204190.65581: variable 'omit' from source: magic vars 15980 1727204190.65617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15980 1727204190.65653: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15980 1727204190.65671: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15980 1727204190.65687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204190.65699: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15980 1727204190.65737: variable 'inventory_hostname' from source: host vars for 'managed-node2' 15980 1727204190.65741: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204190.65743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204190.65809: 
Set connection var ansible_connection to ssh 15980 1727204190.65816: Set connection var ansible_pipelining to False 15980 1727204190.65823: Set connection var ansible_module_compression to ZIP_DEFLATED 15980 1727204190.65831: Set connection var ansible_timeout to 10 15980 1727204190.65843: Set connection var ansible_shell_type to sh 15980 1727204190.65850: Set connection var ansible_shell_executable to /bin/sh 15980 1727204190.65875: variable 'ansible_shell_executable' from source: unknown 15980 1727204190.65883: variable 'ansible_connection' from source: unknown 15980 1727204190.65887: variable 'ansible_module_compression' from source: unknown 15980 1727204190.65889: variable 'ansible_shell_type' from source: unknown 15980 1727204190.65892: variable 'ansible_shell_executable' from source: unknown 15980 1727204190.65894: variable 'ansible_host' from source: host vars for 'managed-node2' 15980 1727204190.65899: variable 'ansible_pipelining' from source: unknown 15980 1727204190.65902: variable 'ansible_timeout' from source: unknown 15980 1727204190.65907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 15980 1727204190.66032: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204190.66041: variable 'omit' from source: magic vars 15980 1727204190.66046: starting attempt loop 15980 1727204190.66049: running the handler 15980 1727204190.66061: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15980 1727204190.66080: 
_low_level_execute_command(): starting 15980 1727204190.66086: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15980 1727204190.66664: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204190.66672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204190.66675: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204190.66677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204190.66721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204190.66740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204190.66823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204190.68562: stdout chunk (state=3): >>>/root <<< 15980 1727204190.68670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204190.68734: stderr chunk (state=3): >>><<< 15980 1727204190.68738: stdout chunk (state=3): >>><<< 15980 1727204190.68761: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204190.68778: _low_level_execute_command(): starting 15980 1727204190.68788: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204190.6876214-19556-151738214574771 `" && echo ansible-tmp-1727204190.6876214-19556-151738214574771="` echo /root/.ansible/tmp/ansible-tmp-1727204190.6876214-19556-151738214574771 `" ) && sleep 0' 15980 1727204190.69298: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204190.69302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15980 1727204190.69315: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 15980 1727204190.69319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204190.69369: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204190.69373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204190.69375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204190.69452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204190.71443: stdout chunk (state=3): >>>ansible-tmp-1727204190.6876214-19556-151738214574771=/root/.ansible/tmp/ansible-tmp-1727204190.6876214-19556-151738214574771 <<< 15980 1727204190.71550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204190.71620: stderr chunk (state=3): >>><<< 15980 1727204190.71623: stdout chunk (state=3): >>><<< 15980 1727204190.71640: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204190.6876214-19556-151738214574771=/root/.ansible/tmp/ansible-tmp-1727204190.6876214-19556-151738214574771 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204190.71673: variable 'ansible_module_compression' from source: unknown 15980 1727204190.71724: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15980vtkmnsww/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15980 1727204190.71761: variable 'ansible_facts' from source: unknown 15980 1727204190.71823: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204190.6876214-19556-151738214574771/AnsiballZ_command.py 15980 1727204190.71947: Sending initial data 15980 1727204190.71951: Sent initial data (156 bytes) 15980 1727204190.72462: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204190.72469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 15980 1727204190.72472: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15980 1727204190.72474: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204190.72477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204190.72535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204190.72539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204190.72541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204190.72620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204190.74236: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15980 1727204190.74305: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15980 1727204190.74378: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp9woorisd /root/.ansible/tmp/ansible-tmp-1727204190.6876214-19556-151738214574771/AnsiballZ_command.py <<< 15980 1727204190.74382: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204190.6876214-19556-151738214574771/AnsiballZ_command.py" <<< 15980 1727204190.74450: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15980vtkmnsww/tmp9woorisd" to remote "/root/.ansible/tmp/ansible-tmp-1727204190.6876214-19556-151738214574771/AnsiballZ_command.py" <<< 15980 1727204190.74454: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204190.6876214-19556-151738214574771/AnsiballZ_command.py" <<< 15980 1727204190.75136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204190.75218: stderr chunk (state=3): >>><<< 15980 1727204190.75222: stdout chunk (state=3): >>><<< 15980 1727204190.75245: done transferring module to remote 15980 1727204190.75255: _low_level_execute_command(): starting 15980 1727204190.75260: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204190.6876214-19556-151738214574771/ /root/.ansible/tmp/ansible-tmp-1727204190.6876214-19556-151738214574771/AnsiballZ_command.py && sleep 0' 15980 1727204190.75762: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204190.75773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204190.75777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 15980 1727204190.75784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204190.75827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204190.75831: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204190.75835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204190.75904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204190.77741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204190.77804: stderr chunk (state=3): >>><<< 15980 1727204190.77808: stdout chunk (state=3): >>><<< 15980 1727204190.77827: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15980 1727204190.77831: _low_level_execute_command(): starting 15980 1727204190.77834: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204190.6876214-19556-151738214574771/AnsiballZ_command.py && sleep 0' 15980 1727204190.78347: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204190.78351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204190.78355: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 15980 1727204190.78358: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204190.78360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204190.78418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204190.78421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204190.78428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204190.78503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204190.95954: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:f7:13:22:8f:c1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.47.73/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3324sec preferred_lft 3324sec\n inet6 fe80::f7:13ff:fe22:8fc1/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.47.73 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.47.73 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. 
This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:56:30.948619", "end": "2024-09-24 14:56:30.957862", "delta": "0:00:00.009243", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15980 1727204190.97541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 15980 1727204190.97594: stderr chunk (state=3): >>><<< 15980 1727204190.97598: stdout chunk (state=3): >>><<< 15980 1727204190.97616: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:f7:13:22:8f:c1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.47.73/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3324sec preferred_lft 3324sec\n inet6 fe80::f7:13ff:fe22:8fc1/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.47.73 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.47.73 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:56:30.948619", "end": "2024-09-24 14:56:30.957862", "delta": "0:00:00.009243", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 15980 1727204190.97665: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204190.6876214-19556-151738214574771/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15980 1727204190.97675: _low_level_execute_command(): starting 15980 1727204190.97680: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204190.6876214-19556-151738214574771/ > /dev/null 2>&1 && sleep 0' 15980 1727204190.98174: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204190.98177: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204190.98180: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15980 1727204190.98188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 15980 1727204190.98206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15980 1727204190.98236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 15980 1727204190.98240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15980 1727204190.98250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15980 1727204190.98324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15980 1727204191.00261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15980 1727204191.00336: stderr chunk (state=3): >>><<< 15980 1727204191.00340: stdout chunk (state=3): >>><<< 15980 1727204191.00361: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.47.73 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
15980 1727204191.00373: handler run complete
15980 1727204191.00396: Evaluated conditional (False): False
15980 1727204191.00405: attempt loop complete, returning result
15980 1727204191.00413: _execute() done
15980 1727204191.00415: dumping result to json
15980 1727204191.00442: done dumping result, returning
15980 1727204191.00445: done running TaskExecutor() for managed-node2/TASK: Check routes and DNS [127b8e07-fff9-5f1d-4b72-00000000050b]
15980 1727204191.00447: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000050b
15980 1727204191.00574: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000050b
15980 1727204191.00577: WORKER PROCESS EXITING
ok: [managed-node2] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n",
    "delta": "0:00:00.009243",
    "end": "2024-09-24 14:56:30.957862",
    "rc": 0,
    "start": "2024-09-24 14:56:30.948619"
}

STDOUT:

IP
1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
    inet6 ::1/128 scope host noprefixroute
       valid_lft forever preferred_lft forever
2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000
    link/ether 02:f7:13:22:8f:c1 brd ff:ff:ff:ff:ff:ff
    altname enX0
    inet 10.31.47.73/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0
       valid_lft 3324sec preferred_lft 3324sec
    inet6 fe80::f7:13ff:fe22:8fc1/64 scope link noprefixroute
       valid_lft forever preferred_lft forever
IP ROUTE
default via 10.31.44.1 dev eth0 proto dhcp src 10.31.47.73 metric 100
10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.47.73 metric 100
IP -6 ROUTE
fe80::/64 dev eth0 proto kernel metric 1024 pref medium
RESOLV
# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).
# Do not edit.
#
# This file might be symlinked as /etc/resolv.conf. If you're looking at
# /etc/resolv.conf and seeing this text, you have followed the symlink.
#
# This is a dynamic resolv.conf file for connecting local clients to the
# internal DNS stub resolver of systemd-resolved. This file lists all
# configured search domains.
#
# Run "resolvectl status" to see details about the uplink DNS servers
# currently in use.
#
# Third party programs should typically not access this file directly, but only
# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a
# different way, replace this symlink by a static file or a different symlink.
#
# See man:systemd-resolved.service(8) for details about the supported modes of
# operation for /etc/resolv.conf.
nameserver 127.0.0.53
options edns0 trust-ad
search us-east-1.aws.redhat.com
15980 1727204191.00681: no more pending results, returning what we have
15980 1727204191.00684: results queue empty
15980 1727204191.00685: checking for any_errors_fatal
15980 1727204191.00688: done checking for any_errors_fatal
15980 1727204191.00689: checking for max_fail_percentage
15980 1727204191.00691: done checking for max_fail_percentage
15980 1727204191.00691: checking to see if all hosts have failed and the running result is not ok
15980 1727204191.00692: done checking to see if all hosts have failed
15980 1727204191.00693: getting the remaining hosts for this loop
15980 1727204191.00695: done getting the remaining hosts for this loop
15980 1727204191.00699: getting the next task for host managed-node2
15980 1727204191.00705: done getting next task for host managed-node2
15980 1727204191.00713: ^ task is: TASK: Verify DNS and network connectivity
15980 1727204191.00715: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15980 1727204191.00719: getting variables
15980 1727204191.00721: in VariableManager get_vars()
15980 1727204191.00749: Calling all_inventory to load vars for managed-node2
15980 1727204191.00752: Calling groups_inventory to load vars for managed-node2
15980 1727204191.00755: Calling all_plugins_inventory to load vars for managed-node2
15980 1727204191.00778: Calling all_plugins_play to load vars for managed-node2
15980 1727204191.00782: Calling groups_plugins_inventory to load vars for managed-node2
15980 1727204191.00786: Calling groups_plugins_play to load vars for managed-node2
15980 1727204191.02301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15980 1727204191.03615: done with get_vars()
15980 1727204191.03637: done getting variables
15980 1727204191.03689: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Verify DNS and network connectivity] *************************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Tuesday 24 September 2024  14:56:31 -0400 (0:00:00.393)       0:00:52.447 *****
15980 1727204191.03715: entering _queue_task() for managed-node2/shell
15980 1727204191.04006: worker is 1 (out of 1 available)
15980 1727204191.04021: exiting _queue_task() for managed-node2/shell
15980 1727204191.04036: done queuing things up, now waiting for results queue to drain
15980 1727204191.04038: waiting for pending results...
15980 1727204191.04232: running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity
15980 1727204191.04315: in run() - task 127b8e07-fff9-5f1d-4b72-00000000050c
15980 1727204191.04335: variable 'ansible_search_path' from source: unknown
15980 1727204191.04339: variable 'ansible_search_path' from source: unknown
15980 1727204191.04368: calling self._execute()
15980 1727204191.04448: variable 'ansible_host' from source: host vars for 'managed-node2'
15980 1727204191.04452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
15980 1727204191.04464: variable 'omit' from source: magic vars
15980 1727204191.04776: variable 'ansible_distribution_major_version' from source: facts
15980 1727204191.04786: Evaluated conditional (ansible_distribution_major_version != '6'): True
15980 1727204191.04931: variable 'ansible_facts' from source: unknown
15980 1727204191.05973: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False
15980 1727204191.05977: when evaluation is False, skipping this task
15980 1727204191.05981: _execute() done
15980 1727204191.05983: dumping result to json
15980 1727204191.05986: done dumping result, returning
15980 1727204191.05988: done running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity [127b8e07-fff9-5f1d-4b72-00000000050c]
15980 1727204191.05991: sending task result for task 127b8e07-fff9-5f1d-4b72-00000000050c
15980 1727204191.06070: done sending task result for task 127b8e07-fff9-5f1d-4b72-00000000050c
15980 1727204191.06074: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"",
    "skip_reason": "Conditional result was False"
}
15980 1727204191.06130: no more pending results, returning what we have
15980 1727204191.06135: results queue empty
15980 1727204191.06136: checking for any_errors_fatal
15980 1727204191.06241: done checking for any_errors_fatal
15980 1727204191.06243: checking for max_fail_percentage
15980 1727204191.06245: done checking for max_fail_percentage
15980 1727204191.06245: checking to see if all hosts have failed and the running result is not ok
15980 1727204191.06246: done checking to see if all hosts have failed
15980 1727204191.06247: getting the remaining hosts for this loop
15980 1727204191.06248: done getting the remaining hosts for this loop
15980 1727204191.06252: getting the next task for host managed-node2
15980 1727204191.06268: done getting next task for host managed-node2
15980 1727204191.06271: ^ task is: TASK: meta (flush_handlers)
15980 1727204191.06273: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15980 1727204191.06277: getting variables
15980 1727204191.06279: in VariableManager get_vars()
15980 1727204191.06306: Calling all_inventory to load vars for managed-node2
15980 1727204191.06309: Calling groups_inventory to load vars for managed-node2
15980 1727204191.06312: Calling all_plugins_inventory to load vars for managed-node2
15980 1727204191.06323: Calling all_plugins_play to load vars for managed-node2
15980 1727204191.06329: Calling groups_plugins_inventory to load vars for managed-node2
15980 1727204191.06333: Calling groups_plugins_play to load vars for managed-node2
15980 1727204191.08207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15980 1727204191.10608: done with get_vars()
15980 1727204191.10661: done getting variables
15980 1727204191.10757: in VariableManager get_vars()
15980 1727204191.10772: Calling all_inventory to load vars for managed-node2
15980 1727204191.10775: Calling groups_inventory to load vars for managed-node2
15980 1727204191.10777: Calling all_plugins_inventory to load vars for managed-node2
15980 1727204191.10783: Calling all_plugins_play to load vars for managed-node2
15980 1727204191.10785: Calling groups_plugins_inventory to load vars for managed-node2
15980 1727204191.10788: Calling groups_plugins_play to load vars for managed-node2
15980 1727204191.12533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15980 1727204191.14947: done with get_vars()
15980 1727204191.15001: done queuing things up, now waiting for results queue to drain
15980 1727204191.15004: results queue empty
15980 1727204191.15004: checking for any_errors_fatal
15980 1727204191.15008: done checking for any_errors_fatal
15980 1727204191.15008: checking for max_fail_percentage
15980 1727204191.15009: done checking for max_fail_percentage
15980 1727204191.15010: checking to see if all hosts have failed and the running result is not ok
15980 1727204191.15011: done checking to see if all hosts have failed
15980 1727204191.15012: getting the remaining hosts for this loop
15980 1727204191.15012: done getting the remaining hosts for this loop
15980 1727204191.15016: getting the next task for host managed-node2
15980 1727204191.15020: done getting next task for host managed-node2
15980 1727204191.15022: ^ task is: TASK: meta (flush_handlers)
15980 1727204191.15023: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15980 1727204191.15028: getting variables
15980 1727204191.15029: in VariableManager get_vars()
15980 1727204191.15039: Calling all_inventory to load vars for managed-node2
15980 1727204191.15042: Calling groups_inventory to load vars for managed-node2
15980 1727204191.15044: Calling all_plugins_inventory to load vars for managed-node2
15980 1727204191.15058: Calling all_plugins_play to load vars for managed-node2
15980 1727204191.15060: Calling groups_plugins_inventory to load vars for managed-node2
15980 1727204191.15064: Calling groups_plugins_play to load vars for managed-node2
15980 1727204191.16964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15980 1727204191.19595: done with get_vars()
15980 1727204191.19641: done getting variables
15980 1727204191.19719: in VariableManager get_vars()
15980 1727204191.19735: Calling all_inventory to load vars for managed-node2
15980 1727204191.19738: Calling groups_inventory to load vars for managed-node2
15980 1727204191.19740: Calling all_plugins_inventory to load vars for managed-node2
15980 1727204191.19746: Calling all_plugins_play to load vars for managed-node2
15980 1727204191.19748: Calling groups_plugins_inventory to load vars for managed-node2
15980 1727204191.19751: Calling groups_plugins_play to load vars for managed-node2
15980 1727204191.21590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15980 1727204191.24075: done with get_vars()
15980 1727204191.24121: done queuing things up, now waiting for results queue to drain
15980 1727204191.24123: results queue empty
15980 1727204191.24127: checking for any_errors_fatal
15980 1727204191.24128: done checking for any_errors_fatal
15980 1727204191.24129: checking for max_fail_percentage
15980 1727204191.24130: done checking for max_fail_percentage
15980 1727204191.24131: checking to see if all hosts have failed and the running result is not ok
15980 1727204191.24132: done checking to see if all hosts have failed
15980 1727204191.24133: getting the remaining hosts for this loop
15980 1727204191.24134: done getting the remaining hosts for this loop
15980 1727204191.24144: getting the next task for host managed-node2
15980 1727204191.24156: done getting next task for host managed-node2
15980 1727204191.24157: ^ task is: None
15980 1727204191.24159: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15980 1727204191.24160: done queuing things up, now waiting for results queue to drain
15980 1727204191.24161: results queue empty
15980 1727204191.24162: checking for any_errors_fatal
15980 1727204191.24163: done checking for any_errors_fatal
15980 1727204191.24163: checking for max_fail_percentage
15980 1727204191.24165: done checking for max_fail_percentage
15980 1727204191.24168: checking to see if all hosts have failed and the running result is not ok
15980 1727204191.24169: done checking to see if all hosts have failed
15980 1727204191.24170: getting the next task for host managed-node2
15980 1727204191.24173: done getting next task for host managed-node2
15980 1727204191.24174: ^ task is: None
15980 1727204191.24175: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node2              : ok=81   changed=3    unreachable=0    failed=0    skipped=72   rescued=0    ignored=2

Tuesday 24 September 2024  14:56:31 -0400 (0:00:00.205)       0:00:52.652 *****
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.68s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.62s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.56s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 2.41s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.55s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.37s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.25s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Re-test connectivity ---------------- 1.22s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Gathering Facts --------------------------------------------------------- 1.19s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.16s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 1.12s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.10s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.09s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17
Gathering Facts --------------------------------------------------------- 1.09s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 1.05s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64
Gathering Facts --------------------------------------------------------- 1.03s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
Gathering Facts --------------------------------------------------------- 1.03s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 1.02s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.99s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 0.94s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
15980 1727204191.24303: RUNNING CLEANUP
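
For reference, the "Check routes and DNS" result recorded earlier in this log can be read back into task form. The script body below is taken verbatim from the `cmd` field in the logged JSON result; the task attributes around it (name, `changed_when`) are assumptions inferred from the log, not the actual contents of check_network_dns.yml:

```yaml
# Sketch of the diagnostic task behind the "Check routes and DNS" result.
# Shell body copied from the "cmd" field above; surrounding attributes
# are assumptions. changed_when: false matches the logged "changed": false.
- name: Check routes and DNS
  shell: |
    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
     cat /etc/resolv.conf
    else
     echo NO /etc/resolv.conf
     ls -alrtF /etc/resolv.* || :
    fi
  changed_when: false
```

Note that `set -euo pipefail` makes the script abort on the first failing command, so a missing `ip` binary or route table surfaces as a task failure rather than partial output.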